Data Exploration
Load the dataset and pre-process the data.
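A minimal sketch of the load step, assuming the data ships as a CSV file; the file name below is an assumption, not taken from the original script:

```matlab
% Hypothetical file name -- adjust to the actual dataset location
HCV = readtable('HCV-Egy-Data.csv');   % imports the 1385x29 table used below
```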
disp(HCV)
Age Gender BMI Fever NauseaVomting Headache Diarrhea Fatiguegeneralizedboneache Jaundice Epigastricpain WBC RBC HGB Plat AST1 ALT1 ALT4 ALT12 ALT24 ALT36 ALT48 ALTafter24w RNABase RNA4 RNA12 RNAEOT RNAEF BaselinehistologicalGrading Baselinehistologicalstaging
56 1 35 2 1 1 1 2 2 2 7425 4.2488e+06 14 1.1213e+05 99 84 52 109 81 5 5 5 6.5533e+05 6.3454e+05 2.8819e+05 5 5 13 2
46 1 29 1 2 2 1 2 2 1 12101 4.4294e+06 10 1.2937e+05 91 123 95 75 113 57 123 44 40620 5.3864e+05 6.3706e+05 3.368e+05 31085 4 2
57 1 33 2 2 2 2 1 1 1 4178 4.6212e+06 12 1.5152e+05 113 49 95 107 116 5 5 5 5.7115e+05 6.6135e+05 5 7.3595e+05 5.5883e+05 4 4
... (remaining rows of the 1385×29 table omitted)
% Dealing only with two Stages - Advanced Fibrosis and Moderate Fibrosis
HCV.Baselinehistologicalstaging(HCV.Baselinehistologicalstaging<3)=0;
HCV.Baselinehistologicalstaging(HCV.Baselinehistologicalstaging>2)=1;
% Recode Gender and the symptom variables from 1/2 to 0 (absence) / 1 (presence)
for i = [2,4:10]
    HCV.(i)(HCV.(i)==1)=0; %Absence
    HCV.(i)(HCV.(i)==2)=1; %Presence
end
% Create array of the data
HCV_array=table2array(HCV);
The dataset contains two data types: categorical and numeric.
%Summary Statistics of continuous/numeric variables
cont_array=HCV_array(:,[1,3,11:27])
56 35 7425 4248807 14 112132 99 84 52 109 81 5 5 5 655330 634536 288194 5 5
46 29 12101 4429425 10 129367 91 123 95 75 113 57 123 44 40620 538635 637056 336804 31085
57 33 4178 4621191 12 151522 113 49 95 107 116 5 5 5 571148 661346 5 735945 558829
... (remaining rows of cont_array omitted)
mean_val = mean(cont_array);
std_val= std(cont_array);
median_val = median(cont_array);
min_val = min(cont_array);
max_val = max(cont_array);
range_val = range(cont_array);
[corr_val, p_val] = corr(cont_array,HCV_array(:,29),'type','Spearman');
summary_table1 = sortrows(array2table([mean_val;std_val;median_val;min_val;max_val;range_val;corr_val';p_val']', ...
'VariableNames',{'mean','stddev','median','min','max','range','Spearman_Correlation_Coefficient','P-value'}, ...
'RowNames',HCV.Properties.VariableNames([1,3,11:27])),'Spearman_Correlation_Coefficient','descend','ComparisonMethod','abs')
summary_table1 = 19×8 table

| | mean | stddev | median | min | max | range | Spearman_Correlation_Coefficient | P-value |
|---|---|---|---|---|---|---|---|---|
| BMI | 28.6087 | 4.0762 | 29 | 22 | 35 | 13 | -0.0703 | 0.0088 |
| RNAEF | 2.9138e+05 | 2.6770e+05 | 244049 | 5 | 810333 | 810328 | 0.0517 | 0.0543 |
| ALTafter24w | 33.4383 | 7.0736 | 34 | 5 | 45 | 40 | 0.0395 | 0.1419 |
| RNA12 | 2.8875e+05 | 2.8535e+05 | 234359 | 5 | 3731527 | 3731522 | 0.0372 | 0.1660 |
| RNABase | 5.9095e+05 | 3.5394e+05 | 593103 | 11 | 1201086 | 1201075 | 0.0358 | 0.1824 |
| RNA4 | 6.0090e+05 | 3.6232e+05 | 597869 | 5 | 1201715 | 1201710 | -0.0319 | 0.2355 |
| Age | 46.3191 | 8.7815 | 46 | 32 | 61 | 29 | -0.0243 | 0.3656 |
| ALT1 | 83.9162 | 25.9228 | 83 | 39 | 128 | 89 | 0.0236 | 0.3806 |
| Plat | 1.5835e+05 | 3.8795e+04 | 157916 | 93013 | 226464 | 133451 | -0.0204 | 0.4490 |
| AST1 | 82.7747 | 25.9932 | 83 | 39 | 128 | 89 | -0.0140 | 0.6031 |
| ALT48 | 83.6296 | 26.2240 | 83 | 5 | 128 | 123 | -0.0112 | 0.6777 |
| ALT4 | 83.4058 | 26.5297 | 82 | 39 | 128 | 89 | -0.0090 | 0.7389 |
| WBC | 7.5334e+03 | 2.6682e+03 | 7498 | 2991 | 12101 | 9110 | 0.0086 | 0.7480 |
| ALT12 | 83.5105 | 26.0645 | 84 | 39 | 128 | 89 | 0.0048 | 0.8595 |
| ALT24 | 83.7090 | 26.2060 | 83 | 39 | 128 | 89 | -0.0043 | 0.8728 |
| RNAEOT | 2.8766e+05 | 2.6456e+05 | 251376 | 5 | 808450 | 808445 | -0.0037 | 0.8918 |
| ALT36 | 83.1177 | 26.3990 | 84 | 5 | 128 | 123 | 0.0037 | 0.8920 |
| RBC | 4.4221e+06 | 3.4636e+05 | 4438465 | 3816422 | 5018451 | 1202029 | -0.0032 | 0.9047 |
| HGB | 12.5877 | 1.7135 | 13 | 10 | 15 | 5 | 0.0002 | 0.9946 |
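As background on the ranking used above: Spearman's coefficient is simply the Pearson correlation computed on (tied) ranks. A small illustrative check, not part of the original analysis:

```matlab
% Spearman correlation equals Pearson correlation of the tied ranks
x = [3 1 4 1 5]'; y = [9 2 6 5 3]';
rs = corr(x, y, 'type', 'Spearman');
rp = corr(tiedrank(x), tiedrank(y), 'type', 'Pearson');
assert(abs(rs - rp) < 1e-12)
```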
% Top 8 rank correlated numeric features
head(summary_table1)
ans = 8×8 table

| | mean | stddev | median | min | max | range | Spearman_Correlation_Coefficient | P-value |
|---|---|---|---|---|---|---|---|---|
| BMI | 28.6087 | 4.0762 | 29 | 22 | 35 | 13 | -0.0703 | 0.0088 |
| RNAEF | 2.9138e+05 | 2.6770e+05 | 244049 | 5 | 810333 | 810328 | 0.0517 | 0.0543 |
| ALTafter24w | 33.4383 | 7.0736 | 34 | 5 | 45 | 40 | 0.0395 | 0.1419 |
| RNA12 | 2.8875e+05 | 2.8535e+05 | 234359 | 5 | 3731527 | 3731522 | 0.0372 | 0.1660 |
| RNABase | 5.9095e+05 | 3.5394e+05 | 593103 | 11 | 1201086 | 1201075 | 0.0358 | 0.1824 |
| RNA4 | 6.0090e+05 | 3.6232e+05 | 597869 | 5 | 1201715 | 1201710 | -0.0319 | 0.2355 |
| Age | 46.3191 | 8.7815 | 46 | 32 | 61 | 29 | -0.0243 | 0.3656 |
| ALT1 | 83.9162 | 25.9228 | 83 | 39 | 128 | 89 | 0.0236 | 0.3806 |
% Summary Statistics for categorical variables
categorical_dat=HCV(:,[2,4:10]);
cat_dat = HCV_array(:,[2,4:10]);               % avoid shadowing the built-in cat
prevalent_val = mode(cat_dat);
count_val = countcats(categorical(cat_dat),1); % per-column counts of 0 (absent) and 1 (present)
perc_val = count_val/size(cat_dat,1);
for i = 1:width(categorical_dat)
    [~,chi2(i),p(i)] = crosstab(categorical_dat.(i),HCV.Baselinehistologicalstaging);
end
summary_table2 = sortrows(array2table([prevalent_val',count_val',perc_val'*100,chi2',p'],"RowNames",HCV.Properties.VariableNames([2,4:10]),'VariableNames',{'Mode','Count_Absent','Count_Present','Percent_Absent','Percent_Present','Chi_Square-statistic','P-value'}),'P-value')
summary_table2 = 8×7 table

| | Mode | Count_Absent | Count_Present | Percent_Absent | Percent_Present | Chi_Square-statistic | P-value |
|---|---|---|---|---|---|---|---|
| NauseaVomting | 1 | 689 | 696 | 49.7473 | 50.2527 | 4.9507 | 0.0261 |
| Epigastricpain | 1 | 687 | 698 | 49.6029 | 50.3971 | 3.0912 | 0.0787 |
| Gender | 0 | 707 | 678 | 51.0469 | 48.9531 | 2.2702 | 0.1319 |
| Fever | 1 | 671 | 714 | 48.4477 | 51.5523 | 0.5090 | 0.4756 |
| Fatiguegeneralizedboneache | 0 | 694 | 691 | 50.1083 | 49.8917 | 0.0061 | 0.9380 |
| Diarrhea | 1 | 689 | 696 | 49.7473 | 50.2527 | 0.0055 | 0.9410 |
| Headache | 0 | 698 | 687 | 50.3971 | 49.6029 | 0.0049 | 0.9440 |
| Jaundice | 1 | 691 | 694 | 49.8917 | 50.1083 | 0.0009 | 0.9763 |
Data Visualization
Binary Data
% Heatmap of each binary variable against the fibrosis stage
for i = [2,4:10]
    figure
    heatmap(HCV,'Baselinehistologicalstaging',HCV.Properties.VariableNames{i});
end
heatmap(HCV,'BaselinehistologicalGrading','Baselinehistologicalstaging');
Continuous Data
% Overlaid histograms of each continuous variable, split by stage
for i = [1,3,11:27]
    figure
    histogram(HCV.(i)(HCV.(29)==0))
    hold on
    histogram(HCV.(i)(HCV.(29)==1))
    hold off
    xlabel(HCV.Properties.VariableNames{i});
    legend({'Moderate Stage','Advanced Stage'})
end
Biological Significance
% Observe RNA levels and ALT levels for moderate and advanced fibrosis
moderateFib = HCV(HCV.Baselinehistologicalstaging==0,:);
advancedFib = HCV(HCV.Baselinehistologicalstaging==1,:);
modFib = table2array(moderateFib);   % "modFib" avoids shadowing the built-in mod
advFib = table2array(advancedFib);
% RNA levels (columns 23:26 = RNABase, RNA4, RNA12, RNAEOT)
plot([0,4,12,18],mean(modFib(:,23:26)),'LineWidth',2,'LineStyle',"--")
hold on
plot([0,4,12,18],mean(advFib(:,23:26)),'Color','r','LineWidth',2)
hold off
ylabel('RNA Levels throughout the treatment')
xlabel('Treatment duration (months)')
legend('Moderate Fibrosis','Advanced Fibrosis')
%230th individual ideal case
% ALT levels (columns 16:21 = ALT1 through ALT48)
figure
plot([1,4,12,24,36,48]/12,mean(modFib(:,16:21)),'LineWidth',2,'LineStyle',"--")
hold on
plot([1,4,12,24,36,48]/12,mean(advFib(:,16:21)),'Color','r','LineWidth',2)
hold off
ylabel('ALT Levels throughout the treatment')
xlabel('Initial Treatment duration (months)')
legend('Moderate Fibrosis','Advanced Fibrosis')
%445th individual ideal case
We see a drastic difference between the ALT levels of individuals with moderate fibrosis and those with advanced fibrosis, while the RNA levels of the two groups are similar. Surprisingly, in both groups the ALT levels reach their lowest point around the 3rd month and then begin rising again. The ALT levels appear to keep fluctuating and gradually settle to a roughly constant value by the end of treatment, the 18th month.
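This reading of the curves can be checked numerically. A short sketch that recomputes the moderate-fibrosis group so it runs on its own (columns 16:21 hold the ALT measurements, as above):

```matlab
% Find the time point at which the mean ALT curve is lowest (moderate group)
modArr = table2array(HCV(HCV.Baselinehistologicalstaging==0,:));
t = [1,4,12,24,36,48]/12;                % same time axis as the plot
[lowALT,k] = min(mean(modArr(:,16:21))); % minimum of the mean ALT curve
fprintf('Mean ALT is lowest (%.1f) at t = %.1f\n', lowALT, t(k))
```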
Min-Max Normalization of Data
% Separate predictors (X) from the target stage (Y); the 28/1 split follows the 29-column table
X = HCV_array(:,1:28); Y = HCV_array(:,29);
colmin = min(X); colmax = max(X);
X = rescale(X, 'InputMin', colmin, 'InputMax', colmax);
HCV_new = array2table([X,Y],"VariableNames",HCV.Properties.VariableNames)
HCV_new = 1385×29 table

| | Age | Gender | BMI | Fever | NauseaVomting | Headache | Diarrhea | Fatiguegeneralizedboneache | Jaundice | Epigastricpain | WBC | RBC | HGB | Plat | AST1 | ALT1 | ALT4 | ALT12 | ALT24 | ALT36 | ALT48 | ALTafter24w | RNABase | RNA4 | RNA12 | RNAEOT | RNAEF | BaselinehistologicalGrading | Baselinehistologicalstaging |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.8276 | 0 | 1.0000 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0.4867 | 0.3597 | 0.8000 | 0.1433 | 0.6742 | 0.5056 | 0.1461 | 0.7865 | 0.4719 | 0 | 0 | 0 | 0.5456 | 0.5280 | 0.0772 | 0 | 0 | 0.7692 | 0 |
| 2 | 0.4828 | 0 | 0.5385 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 1.0000 | 0.5100 | 0 | 0.2724 | 0.5843 | 0.9438 | 0.6292 | 0.4045 | 0.8315 | 0.4228 | 0.9593 | 0.9750 | 0.0338 | 0.4482 | 0.1707 | 0.4166 | 0.0384 | 0.0769 | 0 |
| 3 | 0.8621 | 0 | 0.8462 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0.1303 | 0.6695 | 0.4000 | 0.4384 | 0.8315 | 0.1124 | 0.6292 | 0.7640 | 0.8652 | 0 | 0 | 0 | 0.4755 | 0.5503 | 0 | 0.9103 | 0.6896 | 0.0769 | 1 |
| 4 | 0.5862 | 1 | 0.8462 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0.3841 | 0.8138 | 0 | 0.4005 | 0.0449 | 0.2809 | 0.7865 | 0.4607 | 0.5506 | 0.3496 | 0.5854 | 0.7000 | 0.8675 | 0.3744 | 0.1570 | 0.9209 | 0.7186 | 0.5385 | 1 |
| 5 | 0.9310 | 0 | 0.7692 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0.0735 | 0.6572 | 0.2000 | 0.7094 | 0.6742 | 0.7303 | 0.3146 | 0.1011 | 0.9101 | 0.7236 | 0.6911 | 0.6250 | 0.5498 | 0.6147 | 1.0000 | 0.4193 | 0.2997 | 0.6154 | 0 |

(rows 6 onward omitted; all predictors now lie in [0,1])
|---|
| 27 | 0.6552 | 0 | 0.9231 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0.8827 | 0.1965 | 1.0000 | 0.0112 | 0.0225 | 0.1685 | 1.0000 | 0.2809 | 0.3596 | 0.6829 | 0.6667 | 0.7250 | 0.6380 | 0.4421 | 0.1977 | 0.9090 | 0.4601 | 0.1538 | 0 |
|---|
| 28 | 0.2414 | 1 | 0.8462 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0.2462 | 0.9066 | 0.4000 | 0.7291 | 0.2472 | 0.9101 | 0.8315 | 0.4045 | 0.5506 | 0.8862 | 0.7642 | 0.9500 | 0.4050 | 0.0383 | 0.0122 | 0.9070 | 0.0241 | 0.9231 | 0 |
|---|
| 29 | 0.8621 | 1 | 0.3077 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0.3345 | 0.7877 | 0.6000 | 0.2526 | 0.1348 | 0.8876 | 0.6629 | 0.0337 | 0.6067 | 0.3902 | 0.6341 | 1.0000 | 0.2376 | 0.1553 | 0 | 0 | 0 | 0.4615 | 1 |
|---|
| 30 | 0.5172 | 1 | 0.5385 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0.3134 | 0.7796 | 1.0000 | 0.0878 | 0.9101 | 0.3708 | 0.8764 | 0.9775 | 0.0674 | 0.7317 | 0.3577 | 0.8250 | 0.3548 | 0.2062 | 0.2055 | 0.4665 | 0.3889 | 0.4615 | 0 |
|---|
| 31 | 0.7931 | 1 | 0.8462 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0.2626 | 0.1522 | 1.0000 | 0.6688 | 0.6404 | 0.1124 | 0.2247 | 0.5506 | 0.2584 | 0.4309 | 0.6179 | 0.9000 | 0.9944 | 0.7728 | 0.0080 | 0.1537 | 0.3012 | 0.3077 | 0 |
|---|
| 32 | 0.8966 | 1 | 1.0000 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0.4816 | 0.1518 | 0 | 0.8100 | 0.2022 | 0.7978 | 1.0000 | 0.6404 | 0.3371 | 0.8130 | 0.5447 | 0.5250 | 0.4643 | 0.2394 | 0.1671 | 0.0827 | 0.0432 | 0.1538 | 0 |
|---|
| 33 | 0.5172 | 1 | 0.2308 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 0.4934 | 0.6515 | 0.2000 | 0.5571 | 0.6180 | 0.2809 | 0.1685 | 0.9326 | 0.2809 | 0.4797 | 0.7398 | 0.4750 | 0.5029 | 0.3464 | 0.0867 | 0.8865 | 0.8374 | 0.3846 | 0 |
|---|
| 34 | 1.0000 | 0 | 0.8462 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0.9637 | 0.6362 | 0.6000 | 0.2445 | 0.0337 | 0.0899 | 0.4831 | 0.7079 | 0.1011 | 0.5772 | 0.3902 | 0.7250 | 0.9657 | 0.2650 | 0.1241 | 0.4717 | 0.3491 | 0.5385 | 0 |
|---|
| 35 | 0.1724 | 0 | 0.3846 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 0.3787 | 0.2155 | 0.2000 | 0.1928 | 0.0337 | 0.8876 | 0.3146 | 0.8090 | 0.1011 | 0.8293 | 0.7805 | 1.0000 | 0.2270 | 0.0762 | 0.1084 | 0.8338 | 0.2993 | 0.6154 | 1 |
|---|
| 36 | 0.3103 | 0 | 0.5385 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0.8027 | 0.2797 | 0.8000 | 0.2083 | 1.0000 | 0.7079 | 0.4494 | 0.2697 | 0.4607 | 0.6585 | 0.9919 | 0.8750 | 0.9701 | 0.0429 | 0.0984 | 0.7273 | 0.9210 | 1.0000 | 1 |
|---|
| 37 | 0.9655 | 1 | 0.7692 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0.4801 | 0.1720 | 0.8000 | 0.9701 | 0.1461 | 0.9775 | 0.3146 | 0.9775 | 0.9775 | 0.2927 | 0.3984 | 0.7500 | 0.0936 | 0.4070 | 0.1237 | 0.4156 | 0.3545 | 0.1538 | 0 |
|---|
| 38 | 0.7586 | 0 | 0.5385 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0.8467 | 0.9111 | 0 | 0.5898 | 0.0112 | 0.0449 | 0.0787 | 0.2809 | 0.6966 | 0.3252 | 0.6992 | 0.9500 | 0.0393 | 0.4835 | 0.2116 | 0.3252 | 0.1468 | 0.9231 | 0 |
|---|
| 39 | 0.2759 | 1 | 0.4615 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0.0020 | 0.4474 | 0.8000 | 0.0194 | 0.3371 | 0.2135 | 0.2584 | 0.1236 | 0.2360 | 0.6423 | 0.8862 | 0.6500 | 0.8004 | 0.0592 | 0.0076 | 0.0384 | 0.0017 | 0.4615 | 0 |
|---|
| 40 | 0 | 0 | 0.6923 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0.7645 | 0.1024 | 0.8000 | 0.7750 | 0.4382 | 0.4719 | 0.5730 | 0.1011 | 0.3258 | 0.6341 | 1.0000 | 0.8500 | 0.7119 | 0.0853 | 0.1092 | 0.2721 | 0.5004 | 0.3077 | 1 |
|---|
| 41 | 0.8966 | 1 | 0.8462 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 0.3991 | 0.0609 | 0.8000 | 0.6735 | 0.7528 | 0.3371 | 0.9888 | 0.6742 | 0.0899 | 0.7967 | 0.8618 | 0.7000 | 0.8722 | 0.2666 | 0.0936 | 0.6764 | 0.7947 | 0.7692 | 0 |
|---|
| 42 | 0.1724 | 1 | 0.0769 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0.8125 | 0.4095 | 0.2000 | 0.6826 | 0.6067 | 0.1910 | 0.0112 | 0.9551 | 0.6966 | 0.3659 | 0.6911 | 0.7000 | 0.2259 | 0.1717 | 0.0405 | 0.3806 | 0.2154 | 0.2308 | 0 |
|---|
| 43 | 0.8966 | 0 | 0.0769 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0.7953 | 0.8165 | 0.8000 | 0.0637 | 0.9888 | 0.5955 | 0.6180 | 0.8315 | 0.6404 | 0.9837 | 0.8374 | 0.8500 | 0.2269 | 0.8831 | 0.0619 | 0.2499 | 0.3626 | 0.8462 | 1 |
|---|
| 44 | 0.1379 | 0 | 0.0769 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 0.1532 | 0.7649 | 0.8000 | 0.6344 | 0.4607 | 0.1011 | 0.4157 | 0.9101 | 0.4831 | 0.8618 | 0.7805 | 0.9500 | 0.4948 | 0.9627 | 0.1170 | 0.9008 | 0.8457 | 0.1538 | 1 |
|---|
| 45 | 0.5172 | 1 | 1.0000 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0.9806 | 0.0716 | 0.8000 | 0.0358 | 0.5506 | 0.5618 | 0.0899 | 0.1011 | 0.2697 | 0.4228 | 0.8862 | 0.5250 | 0.5426 | 0.3518 | 0.1104 | 0.1132 | 0.4645 | 0.6923 | 1 |
|---|
| 46 | 0.6207 | 0 | 0.8462 | 1 | 1 | 0 | 0 | 1 | 0 | 1 | 0.7847 | 0.6273 | 0 | 0.8753 | 0.7416 | 0.8652 | 0.7528 | 0.8090 | 0.5056 | 0.8293 | 0.5366 | 0.7500 | 0.0482 | 0.7221 | 0.2033 | 0.3954 | 0.1972 | 0.7692 | 0 |
|---|
| 47 | 0.4138 | 0 | 0.6923 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0.0526 | 0.3295 | 0.4000 | 0.4296 | 0.8427 | 0.3258 | 0.9888 | 0.0899 | 0.8764 | 1.0000 | 0.3902 | 0.6750 | 0.6253 | 0.6870 | 0.0954 | 0.1249 | 0.7826 | 0.6154 | 1 |
|---|
| 48 | 0.3793 | 0 | 0.8462 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0.2666 | 0.3873 | 0 | 0.7042 | 0.8315 | 0.0112 | 0.0337 | 0.0337 | 0.4382 | 0.8211 | 0.4228 | 0.5750 | 0.6165 | 0.9964 | 0.0484 | 0.6488 | 0.3601 | 0.9231 | 0 |
|---|
| 49 | 0.7586 | 0 | 0.8462 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0.4360 | 0.9617 | 0 | 0.7223 | 0.9663 | 0.1011 | 0.8876 | 0.2247 | 0.3483 | 0.8130 | 0.8537 | 0.5000 | 0.7090 | 0.5821 | 0.1818 | 0.9472 | 0.6743 | 0.4615 | 1 |
|---|
| 50 | 0.9310 | 1 | 0.3077 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3576 | 0.4253 | 0.4000 | 0.3378 | 0.6854 | 0.7865 | 0.9438 | 0.5169 | 0.6292 | 0.6911 | 0.5447 | 0.5500 | 0.5712 | 0.0412 | 0.0165 | 0.8715 | 0.3002 | 0.9231 | 0 |
|---|
| 51 | 0.0345 | 1 | 0.6923 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0.2308 | 0.7181 | 1.0000 | 0.2269 | 0.5056 | 0.3596 | 0.2584 | 0.8764 | 0.4831 | 0.9024 | 0.8130 | 0.9250 | 0.8103 | 0.8311 | 0.2026 | 0.2165 | 0.2981 | 0.3846 | 1 |
|---|
| 52 | 0.8276 | 1 | 0.0769 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0.1982 | 0.1978 | 0.8000 | 0.6211 | 0.3483 | 0.2135 | 0.6854 | 0.8202 | 0.4607 | 0.5772 | 0.7805 | 0.9500 | 0.4237 | 0.1088 | 0.0116 | 0.3718 | 0.1137 | 0.7692 | 0 |
|---|
| 53 | 0.3103 | 0 | 0.8462 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0.2250 | 0.2718 | 0.6000 | 0.2088 | 0.8539 | 0.6067 | 0.7865 | 0.1011 | 0.1910 | 0.5772 | 0.7724 | 0.9500 | 0.0991 | 0.3048 | 0.0036 | 0.7114 | 0.4092 | 0.3846 | 0 |
|---|
| 54 | 0.9310 | 0 | 0.7692 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0.4292 | 0.5036 | 0.4000 | 0.8937 | 0.5056 | 0.0225 | 0.5169 | 0.6517 | 0.0562 | 0.6829 | 0.9756 | 0.6750 | 0.2113 | 0.2146 | 0.1025 | 0.8321 | 0.4753 | 0.1538 | 0 |
|---|
| 55 | 0.5172 | 0 | 0.3846 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0.4682 | 0.1025 | 0.8000 | 0.7852 | 0.1798 | 0.5056 | 0.9551 | 0.2247 | 0.8315 | 0.3171 | 0.5041 | 0.9750 | 0.4984 | 0.4777 | 0.0954 | 0.0971 | 0.7492 | 0.7692 | 0 |
|---|
| 56 | 0.6207 | 1 | 0.9231 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0.5739 | 0.1556 | 0.8000 | 0.9343 | 0.1124 | 0.1685 | 0.9888 | 0.2360 | 0 | 0.8537 | 0.3333 | 0.4500 | 0.1246 | 0.1736 | 0.0714 | 0.0664 | 0.5940 | 0.6154 | 0 |
|---|
| 57 | 0.2414 | 1 | 0.6154 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0.1566 | 0.6947 | 0 | 0.6819 | 0.4831 | 0.4045 | 0.1573 | 0.4045 | 0.6292 | 0.7398 | 0.3171 | 0.6750 | 0.1937 | 0.7325 | 0.0469 | 0.2242 | 0.2272 | 0.8462 | 1 |
|---|
| 58 | 0.5517 | 0 | 0.8462 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0.3696 | 0.2768 | 1.0000 | 0.2095 | 0.0112 | 0.3146 | 0.0112 | 0.9326 | 0.3596 | 0.9106 | 0.8618 | 0.6250 | 0.2919 | 0.1252 | 0.1568 | 0.0078 | 0.6631 | 0.7692 | 1 |
|---|
| 59 | 0 | 1 | 0.3846 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0.6233 | 0.6654 | 0.2000 | 0.3420 | 0.4045 | 0.0787 | 0.1910 | 0.9101 | 0.5056 | 0.6992 | 0.4146 | 0.4500 | 0.4187 | 0.4975 | 0.0910 | 0.1447 | 0.9798 | 0.7692 | 0 |
|---|
| 60 | 0.0345 | 1 | 0.1538 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0.7074 | 0.2495 | 0.6000 | 0.2797 | 0.4494 | 0.2697 | 0.7978 | 0.1685 | 0.0225 | 0.7886 | 0.2927 | 0.5500 | 0.8414 | 0.9588 | 0.0250 | 0.7910 | 0.0785 | 0 | 0 |
|---|
| 61 | 0.6552 | 1 | 0.3077 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0.8950 | 0.8549 | 0 | 0.5402 | 0.3820 | 0.3820 | 0.2360 | 0.4831 | 0.7640 | 0.6016 | 0.3008 | 0.6750 | 0.1601 | 0.3896 | 0.0261 | 0.3329 | 0.2017 | 0.3846 | 1 |
|---|
| 62 | 0.6207 | 1 | 0.0769 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 0.2271 | 0.5141 | 0.6000 | 0.3390 | 0.4382 | 0.2697 | 0.2135 | 0.7416 | 0.8315 | 0.4228 | 0.6829 | 0.6000 | 0.3042 | 0.1992 | 0.2025 | 0.3590 | 0.7945 | 0.6923 | 1 |
|---|
| 63 | 0.3448 | 1 | 0.0769 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0.5241 | 0.1211 | 0.6000 | 0.2444 | 0.7865 | 0.5618 | 0.0337 | 0.2360 | 0.0337 | 0.5447 | 0.4065 | 0.5250 | 0.5163 | 0.4533 | 0.1985 | 0.5795 | 0.2571 | 0.6923 | 0 |
|---|
| 64 | 0.5517 | 1 | 0.8462 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 0.8659 | 0.2341 | 1.0000 | 0.5304 | 0.7303 | 0.9775 | 0.7640 | 0.3933 | 0.7191 | 0.7480 | 0.8862 | 0.8250 | 0.0882 | 0.1798 | 0.0796 | 0.9347 | 0.9626 | 0.4615 | 0 |
|---|
| 65 | 0.4483 | 0 | 0.6923 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 0.9329 | 0.0521 | 1.0000 | 0.6354 | 0.9213 | 0.2921 | 0.3596 | 0.0562 | 0.6854 | 1.0000 | 0.8537 | 0.5500 | 0.3504 | 0.9234 | 0.1757 | 0.9319 | 0.6844 | 0.4615 | 0 |
|---|
| 66 | 0.8966 | 0 | 0.4615 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0.1198 | 0.7609 | 0.6000 | 0.9485 | 0.6629 | 0.5281 | 0.1573 | 0.8876 | 1.0000 | 0.7967 | 0.9268 | 0.5250 | 0.1458 | 0.0573 | 0.1695 | 0.2569 | 0.3195 | 0.5385 | 1 |
|---|
| 67 | 0.1379 | 1 | 0.6923 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0.2291 | 0.6986 | 1.0000 | 0.7945 | 0.5618 | 0.6629 | 0.0562 | 0.2247 | 0.9551 | 0.2927 | 0.3252 | 0.5750 | 0.9806 | 0.5943 | 0.0249 | 0.3886 | 0.1268 | 0.1538 | 0 |
|---|
| 68 | 0.6207 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 0.1744 | 0.5596 | 0.8000 | 0.5806 | 0.3483 | 0.7191 | 0.1461 | 0.5730 | 0.8989 | 0.2927 | 0.4228 | 0.6000 | 0.4914 | 0.2418 | 0.0688 | 0.1171 | 0.6714 | 0.8462 | 1 |
|---|
| 69 | 0.2759 | 1 | 0.0769 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0.5480 | 0.8783 | 0 | 0.9333 | 0.7528 | 0.9888 | 0.6180 | 0.5618 | 0.6629 | 0.6098 | 0.4553 | 0.7500 | 0.6060 | 0.3876 | 0.1026 | 0.9592 | 0.0302 | 0.0769 | 0 |
|---|
| 70 | 0.1724 | 0 | 0.3846 | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 0.2754 | 0.1342 | 0.8000 | 0.8278 | 0.8989 | 0.0562 | 0.2022 | 0.8202 | 0.7865 | 0.8049 | 0.4146 | 0.7000 | 0.7799 | 0.6137 | 0.0609 | 0.5073 | 0.7858 | 0.9231 | 1 |
|---|
| 71 | 0.2414 | 0 | 1.0000 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0.1059 | 0.4158 | 1.0000 | 0.1443 | 0.7640 | 0.9888 | 0.6629 | 0.8652 | 0.5843 | 0.8862 | 0.5772 | 0.9000 | 0.5269 | 0.3746 | 0.0987 | 0.4659 | 0.8688 | 0.1538 | 0 |
|---|
| 72 | 0.7586 | 1 | 0.3846 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0.7180 | 0.6420 | 1.0000 | 0.5212 | 0.2022 | 0.4382 | 0.8652 | 0.8427 | 0.0112 | 0.4146 | 0.7317 | 0.5000 | 0.5586 | 0.9642 | 0.1893 | 0.3216 | 0.2351 | 0.6923 | 0 |
|---|
| 73 | 0.3793 | 1 | 0.0769 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0.0619 | 0.6537 | 0.6000 | 0.5012 | 0.5955 | 0.2472 | 0.3034 | 0.4607 | 0.4944 | 0.6098 | 0.8211 | 0.7750 | 0.2728 | 0.7722 | 0.1487 | 0.4640 | 0.1944 | 0.0769 | 1 |
|---|
| 74 | 1.0000 | 0 | 0.2308 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0.1454 | 0.5514 | 0.4000 | 0.4385 | 0.3596 | 0.8315 | 0.4944 | 0.6180 | 0.0674 | 0.3577 | 1.0000 | 0.6250 | 0.8667 | 0.0830 | 0.2111 | 0.5116 | 0.3912 | 0.2308 | 1 |
|---|
| 75 | 0.4138 | 1 | 0.4615 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 0.4450 | 0.6702 | 0.8000 | 0.0618 | 0.6292 | 0.9888 | 0.2022 | 0.9551 | 0.5393 | 0.9350 | 0.5691 | 0.8750 | 0.2213 | 0.0007 | 0.1394 | 0.5987 | 0.5632 | 0.3077 | 1 |
|---|
| 76 | 0.9310 | 0 | 0.2308 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0.5432 | 0.4275 | 1.0000 | 0.4390 | 0 | 0.6966 | 0.5843 | 0.8315 | 0.6292 | 0.8537 | 0.3171 | 0.6750 | 0.7676 | 0.9544 | 0.0056 | 0.1685 | 0.2125 | 0.7692 | 0 |
|---|
| 77 | 0 | 1 | 0.1538 | 1 | 0 | 0 | 1 | 0 | 1 | 1 | 0.9883 | 0.3147 | 1.0000 | 0.5987 | 0.4494 | 0.2584 | 0.7303 | 0.2135 | 0.6180 | 0.8943 | 0.6829 | 0.4250 | 0.1404 | 0.1142 | 0.1475 | 0.3731 | 0.2895 | 0.3846 | 1 |
|---|
| 78 | 0 | 0 | 0.3077 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0.6453 | 0.8337 | 0.8000 | 0.3198 | 0.6966 | 0.2697 | 0.5506 | 0.3820 | 0.9101 | 0.2764 | 0.7886 | 0.4750 | 0.9279 | 0.4313 | 0.1639 | 0.5642 | 0.4970 | 0.9231 | 0 |
|---|
| 79 | 0.0690 | 0 | 0.6923 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0.1382 | 0.5442 | 0.2000 | 0.9565 | 0.4944 | 0.1798 | 0.9888 | 0.1798 | 0.0787 | 0.8537 | 0.9431 | 1.0000 | 0.1420 | 0.9058 | 0.0195 | 0.9834 | 0.5352 | 0.6154 | 1 |
|---|
| 80 | 0.2069 | 0 | 0.9231 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0.6269 | 0.4080 | 0.8000 | 0.9755 | 0.8989 | 0.6966 | 0.2247 | 0.7753 | 0.3146 | 0.3089 | 0.5122 | 0.8750 | 0.0456 | 0.3252 | 0.0528 | 0.4804 | 0.4098 | 0.3846 | 1 |
|---|
| 81 | 1.0000 | 0 | 0.3077 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0.2765 | 0.7185 | 1.0000 | 0.1713 | 0.7753 | 0.2697 | 0.6854 | 0.1348 | 0.6404 | 0.6585 | 0.7805 | 0.8500 | 0.4751 | 0.0055 | 0.1526 | 0.2926 | 0.7740 | 0.6154 | 0 |
|---|
| 82 | 0.0345 | 0 | 0.9231 | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 0.8412 | 0.4145 | 0.2000 | 0.1519 | 0.0337 | 0.6966 | 0.5169 | 0.4045 | 0.2809 | 0.9106 | 0.9756 | 0.5000 | 0.5014 | 0.0378 | 0.0457 | 0.2436 | 0.7992 | 0.3846 | 0 |
|---|
| 83 | 0.8276 | 0 | 0.6154 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0.6876 | 0.7791 | 1.0000 | 0.2973 | 0.3034 | 0.3371 | 0.6404 | 0.7865 | 0.8427 | 0.7317 | 0.3659 | 0.5250 | 0.3518 | 0.8533 | 0.1437 | 0.5568 | 0.6018 | 0.6154 | 0 |
|---|
| 84 | 0.8276 | 0 | 0.3077 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 0.3131 | 0.8621 | 0.4000 | 0.0791 | 0.2247 | 0.7416 | 0.9326 | 0.6180 | 0.9775 | 0.6260 | 0.5366 | 0.6750 | 0.9734 | 0.1348 | 0.0759 | 0.2940 | 0.7635 | 0.9231 | 1 |
|---|
| 85 | 0.0690 | 1 | 0.9231 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 0.9462 | 0.3339 | 0 | 0.2402 | 0.2360 | 0.6180 | 0.2697 | 0.6067 | 0.3483 | 0.6829 | 0.3089 | 0.6250 | 0.9534 | 0.6850 | 0.0122 | 0.5530 | 0.9310 | 0.1538 | 1 |
|---|
| 86 | 0.2414 | 0 | 0.6923 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0.3552 | 0.5715 | 0 | 0.1513 | 0.2921 | 1.0000 | 0.4719 | 0.7528 | 0.9775 | 0.7967 | 0.5935 | 0.7250 | 0.5779 | 0.7232 | 0.0139 | 0.3238 | 0.2328 | 0.9231 | 0 |
|---|
| 87 | 0.6897 | 1 | 0.3846 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0.4179 | 0.0630 | 0.4000 | 0.4377 | 0.7865 | 0.3146 | 0.2247 | 0.6067 | 0.8539 | 1.0000 | 0.6992 | 0.5000 | 0.9725 | 0.5790 | 0.1098 | 0.7833 | 0.1185 | 0.9231 | 1 |
|---|
| 88 | 0.2414 | 1 | 0.4615 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0.3986 | 0.1145 | 0.2000 | 0.0996 | 0.6292 | 0.2472 | 0.6067 | 0.6517 | 0.5281 | 0.9268 | 0.2764 | 0.6500 | 0.6712 | 0.2688 | 0.1566 | 0.5732 | 0.8276 | 0.0769 | 1 |
|---|
| 89 | 0.1724 | 0 | 0.8462 | 1 | 1 | 1 | 0 | 1 | 0 | 1 | 0.8066 | 0.9401 | 0.8000 | 0.5140 | 0.4944 | 0.7753 | 0.7303 | 0.7978 | 0.2472 | 0.5203 | 0.5285 | 0.5750 | 0.5687 | 0.1972 | 0.1902 | 0.6841 | 0.6951 | 0.2308 | 0 |
|---|
| 90 | 0.8621 | 0 | 0.0769 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0.3345 | 0.2010 | 0.6000 | 0.2604 | 0.1011 | 0.0674 | 0.9326 | 0.1461 | 0.3596 | 0.7317 | 0.4715 | 1.0000 | 0.3201 | 0.8424 | 0.0807 | 0.4268 | 0.7085 | 0.6154 | 0 |
|---|
| 91 | 0.8966 | 1 | 0.9231 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0.3334 | 0.1418 | 0.4000 | 0.8980 | 0.8876 | 0.6404 | 0.9438 | 0 | 0.3483 | 0.5366 | 0.4715 | 0.9250 | 0.6358 | 0.4929 | 0.0088 | 0.4450 | 0.2935 | 0.7692 | 0 |
|---|
| 92 | 0.4483 | 1 | 0.2308 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 0.8125 | 0.0376 | 0.6000 | 0.4727 | 0.5169 | 0.3483 | 0.0337 | 0.2697 | 0.3708 | 1.0000 | 0.6016 | 0.6750 | 0.4064 | 0.4819 | 0.0999 | 0.7092 | 0.5395 | 0.9231 | 0 |
|---|
| 93 | 0.9655 | 1 | 0.2308 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0.8912 | 0.2587 | 0 | 0.0501 | 0.5506 | 0.7753 | 0.9551 | 0.8989 | 0.8652 | 0.8780 | 0.6667 | 0.8000 | 0.5269 | 0.2847 | 0 | 0 | 0 | 0 | 1 |
|---|
| 94 | 0.3793 | 0 | 0.3077 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 0.0821 | 0.1595 | 0.2000 | 0.2869 | 0.8315 | 0.1348 | 0.3933 | 0.8539 | 0.5506 | 0.7886 | 0.7154 | 0.6500 | 0.7082 | 0.8415 | 0.0304 | 0.9187 | 0.4674 | 0.7692 | 1 |
|---|
| 95 | 0.8966 | 0 | 0.3077 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0.4681 | 0.2667 | 0.4000 | 0.3736 | 0.7978 | 0.7640 | 0.2809 | 0.8764 | 0.6067 | 0.4065 | 0.4390 | 0.7750 | 0.2268 | 0.9873 | 0.1769 | 0.6873 | 0.6153 | 0.2308 | 1 |
|---|
| 96 | 0.1724 | 1 | 0.4615 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0.8026 | 0.8435 | 0.6000 | 0.7829 | 0.3258 | 0.4719 | 0.2584 | 0.6067 | 0.2472 | 0.7642 | 0.3089 | 0.5250 | 0.3827 | 0.9693 | 0 | 0 | 0 | 0.4615 | 1 |
|---|
| 97 | 0.2759 | 0 | 0.6923 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0.4434 | 0.4206 | 0 | 0.9468 | 0.7303 | 0.3483 | 0.2472 | 0.2809 | 0.8876 | 0.7642 | 0.4634 | 0.9000 | 0.7271 | 0.7847 | 0.2139 | 0.9619 | 0.4873 | 0.6154 | 0 |
|---|
| 98 | 0.4138 | 0 | 0.6923 | 1 | 1 | 0 | 0 | 1 | 0 | 1 | 0.3623 | 0.1069 | 0.2000 | 0.7452 | 0.1348 | 0.0899 | 0.3371 | 0.0787 | 0.4270 | 0.6016 | 0.3008 | 0.5500 | 0.2770 | 0.5346 | 0.0742 | 0.1533 | 0.1801 | 0.9231 | 0 |
|---|
| 99 | 0.1379 | 1 | 0.9231 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0.9547 | 0.6710 | 0.6000 | 0.0787 | 0.1461 | 0.4157 | 0.7640 | 0.8202 | 0.9438 | 0.6423 | 0.9350 | 0.5500 | 0.6219 | 0.3204 | 0 | 0 | 0 | 0.8462 | 0 |
|---|
| 100 | 0.9655 | 1 | 0.4615 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0.6419 | 0.4627 | 0.4000 | 0.9043 | 0.7416 | 1.0000 | 0.3371 | 0.2809 | 0.1236 | 0.5854 | 0.4797 | 0.4500 | 0.0673 | 0.1471 | 0.0554 | 0.1998 | 0.1518 | 0.6923 | 0 |
|---|
| â‹® |
|---|
FEATURE SELECTION
METHOD 0 : T-Tests
% Two-sample t-test of each of the 28 features between stage groups 0 and 1
for i=1:28
    [h0(i),p0(i)]=ttest2(HCV_new.(i)(HCV_new.Baselinehistologicalstaging==0),HCV_new.(i)(HCV_new.Baselinehistologicalstaging==1));
end
sortrows(array2table([h0',p0'],'RowNames',HCV_new.Properties.VariableNames(1:28),'VariableNames',{'Significant/Non-significant','P-value'}),'P-value')
ans = 28×2 table
| | Significant/Non-significant | P-value |
|---|---|---|
| BMI | 1 | 0.0092 |
| NauseaVomting | 1 | 0.0261 |
| RNAEF | 1 | 0.0394 |
| Epigastricpain | 0 | 0.0788 |
| BaselinehistologicalGrading | 0 | 0.1271 |
| Gender | 0 | 0.1321 |
| ALTafter24w | 0 | 0.1648 |
| RNABase | 0 | 0.1834 |
| RNA4 | 0 | 0.2288 |
| RNA12 | 0 | 0.2680 |
| Age | 0 | 0.3591 |
| ALT1 | 0 | 0.3929 |
| Plat | 0 | 0.4424 |
| Fever | 0 | 0.4759 |
| AST1 | 0 | 0.5987 |
| ALT48 | 0 | 0.6603 |
| RNAEOT | 0 | 0.7020 |
| WBC | 0 | 0.7548 |
| ALT4 | 0 | 0.7675 |
| ALT24 | 0 | 0.8578 |
| ALT12 | 0 | 0.8641 |
| RBC | 0 | 0.8968 |
| ALT36 | 0 | 0.9179 |
| Fatiguegeneralizedboneache | 0 | 0.9380 |
| Diarrhea | 0 | 0.9411 |
| Headache | 0 | 0.9441 |
| Jaundice | 0 | 0.9763 |
| HGB | 0 | 0.9900 |
P-values of 0.0092, 0.0261, and 0.0394 fall below the 0.05 significance level, meaning the two stage groups can be differentiated on those variables. Thus, according to the t-test, BMI, NauseaVomting, and RNAEF are significant variables.
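The statistic behind ttest2 is easy to sketch. The following Python fragment (numpy only, shown for illustration since the script itself is MATLAB) computes the pooled-variance two-sample t statistic that ttest2 uses by default; the sample vectors are made up, not HCV data.

```python
import numpy as np

def pooled_t(x, y):
    """Two-sample t statistic with pooled variance (MATLAB ttest2 default)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    # Pooled variance: weighted average of the two sample variances
    sp2 = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(sp2 * (1 / nx + 1 / ny))

print(round(pooled_t([1, 2, 3], [2, 4, 6]), 4))  # -1.5492
```

The p-value then comes from comparing this statistic against a t distribution with nx + ny - 2 degrees of freedom, which ttest2 handles internally.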
HCV_test=array2table([HCV_new.BMI, HCV_new.RNAEF, HCV_new.NauseaVomting, HCV_new.Baselinehistologicalstaging],'VariableNames',{'BMI','RNAEF','NauseaVomting','Stage'});
The low correlation values are reflected in the shallow slopes of the least-squares reference lines.
% Boxplots of significant variables
figure
boxplot(HCV_test.BMI,HCV_test.Stage)
ylabel('BMI')
xlabel('Cirrhosis Severity')
figure
boxplot(HCV_test.NauseaVomting,HCV_test.Stage)
ylabel('Nausea and Vomiting')
xlabel('Cirrhosis Severity')
figure
boxplot(HCV_test.RNAEF,HCV_test.Stage)
ylabel('RNAEF')
xlabel('Cirrhosis Severity')
METHOD 1
We use the bias-corrected Cramér's V (derived from the chi-square statistic) to measure association between categorical variables, logistic regression to assess association between categorical-continuous variable pairs, and rank-based correlation metrics such as Spearman's to assess association between continuous variables.
Note: categorical data does not come from a normal distribution, so with a categorical output we cannot use linear regression, which assumes normality.
Association between categorical-categorical variables
categorical_dat=HCV_new(:,[2,4:10,28]);
Categorical variables with 2 levels
% Chi-square test of independence for each two-level categorical variable
for i=1:8
    [~,chi2(i),p1(i)] = crosstab(categorical_dat.(i),HCV_new.Baselinehistologicalstaging);
end
The chi-square statistic is sensitive to sample size, so the bias-corrected Cramér's V is preferred.
For a 2×2 contingency table, the bias-corrected Cramér's V equals the phi coefficient.
bias_cramerV=sqrt(chi2/height(HCV));
Categorical variable with more than 2 levels
BaselinehistologicalGrading is a categorical variable whose strength of association also needs to be tested. For its 14×2 contingency table,
[tbl1,chi2(9,1),p1(9,1)] = crosstab(HCV_new.BaselinehistologicalGrading,HCV_new.Baselinehistologicalstaging);
% Bias-corrected Cramer's V (tbl1_size holds the contingency table dimensions)
tbl1_size=size(tbl1);
bias_cramerV(9,1)=sqrt(max([0,(chi2(9,1)/height(HCV))-((tbl1_size(1)-1)*(tbl1_size(2)-1))/(height(HCV)-1)])/min([tbl1_size(2)-((tbl1_size(2)-1)^2)/(height(HCV)-1),tbl1_size(1)-((tbl1_size(1)-1)^2)/(height(HCV)-1)]));
Association of categorical variables with Stages is as follows -
categorical_summary=sortrows(array2table([chi2,p1,bias_cramerV],"RowNames",categorical_dat.Properties.VariableNames,'VariableNames',{'Chi-Square Statistic','P-value','CramersV_Strength'}),'P-value','ascend')
categorical_summary = 9×3 table
| | Chi-Square Statistic | P-value | CramersV_Strength |
|---|---|---|---|
| NauseaVomting | 4.9507 | 0.0261 | 0.0598 |
| Epigastricpain | 3.0912 | 0.0787 | 0.0472 |
| Gender | 2.2702 | 0.1319 | 0.0405 |
| Fever | 0.5090 | 0.4756 | 0.0192 |
| BaselinehistologicalGrading | 9.1334 | 0.7628 | 0 |
| Fatiguegeneralizedboneache | 0.0061 | 0.9380 | 0.0021 |
| Diarrhea | 0.0055 | 0.9410 | 0.0020 |
| Headache | 0.0049 | 0.9440 | 0.0019 |
| Jaundice | 0.0009 | 0.9763 | 0.0008 |
On observing the above table, judging by the strength of association (Cramér's V coefficient) and the p-value at the 5% significance level, only NauseaVomting is a significant variable.
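The bias-corrected Cramér's V computation above can be sketched in Python (numpy only, for illustration). This mirrors the formula used in the MATLAB line; the 2×2 contingency tables in the usage lines are made up.

```python
import numpy as np

def bias_corrected_cramers_v(tbl):
    """Bias-corrected Cramér's V for an r-by-c contingency table,
    mirroring the formula used in the script above."""
    tbl = np.asarray(tbl, float)
    n = tbl.sum()
    r, c = tbl.shape
    # Expected counts under independence, then the chi-square statistic
    expected = np.outer(tbl.sum(axis=1), tbl.sum(axis=0)) / n
    chi2 = ((tbl - expected) ** 2 / expected).sum()
    # Bias-corrected phi^2, clipped at zero
    phi2 = max(0.0, chi2 / n - (r - 1) * (c - 1) / (n - 1))
    denom = min(c - (c - 1) ** 2 / (n - 1), r - (r - 1) ** 2 / (n - 1))
    return float(np.sqrt(phi2 / denom))

print(bias_corrected_cramers_v([[20, 20], [20, 20]]))            # 0.0 (independent)
print(round(bias_corrected_cramers_v([[30, 10], [10, 30]]), 4))  # 0.3456
```

The max(0, ...) clip is what makes the estimator bias-corrected: without it, small samples with no real association would still report a positive V.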
Association between categorical-continuous variables
continuous_var=HCV_new(:,[1,3,11:27]);
% X holds the predictor matrix and Y the Baselinehistologicalstaging labels
[B,dev,stats]=mnrfit(X(:,[1,3,11:27]),categorical(Y))
B = 20×1 (intercept coefficient followed by the slope coefficients; display truncated)
-0.1123
0.1901
0.4397
-0.1000
0.0302
-0.0278
0.1407
0.1238
-0.1801
0.0364
⋮
dev = 1.8968e+03
stats =
beta: [20×1 double]
dfe: 1365
sfit: 1.0073
s: 1
estdisp: 0
covb: [20×20 double]
coeffcorr: [20×20 double]
se: [20×1 double]
t: [20×1 double]
p: [20×1 double]
resid: [1385×2 double]
residp: [1385×2 double]
residd: [1385×1 double]
stats.p = 20×1 (intercept p-value followed by the slope p-values; display truncated)
0.8131
0.2905
0.0118
0.5908
0.8733
0.8613
0.4535
0.5060
0.3375
0.8423
⋮
sortrows(array2table([B(2:end),stats.p(2:end)],"RowNames",continuous_var.Properties.VariableNames,'VariableNames',{'B-value','P-value'}),'P-value','ascend')
ans = 19×2 table
| | B-value | P-value |
|---|---|---|
| BMI | 0.4397 | 0.0118 |
| RNAEF | -0.3883 | 0.0445 |
| RNAEOT | 0.3169 | 0.1026 |
| ALTafter24w | -0.4199 | 0.1732 |
| RNABase | -0.2310 | 0.2119 |
| RNA4 | 0.1932 | 0.2856 |
| Age | 0.1901 | 0.2905 |
| ALT1 | -0.1801 | 0.3375 |
| RNA12 | -0.6656 | 0.4157 |
| Plat | 0.1407 | 0.4535 |
| AST1 | 0.1238 | 0.5060 |
| ALT48 | 0.1631 | 0.5254 |
| WBC | -0.1000 | 0.5908 |
| ALT24 | 0.0452 | 0.8075 |
| ALT4 | 0.0364 | 0.8423 |
| HGB | -0.0278 | 0.8613 |
| RBC | 0.0302 | 0.8733 |
| ALT12 | -0.0280 | 0.8807 |
| ALT36 | -0.0066 | 0.9791 |
Here, at the 5% significance level, only BMI and RNAEF are significant.
Thus NauseaVomting, BMI, and RNAEF are the significant variables overall, the same set found by Method 0.
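For a binary response, mnrfit reduces to ordinary logistic regression fitted by Newton's method (iteratively reweighted least squares). A minimal numpy sketch on synthetic data (not the HCV table) shows the idea; the variable names and the small ridge term are illustrative choices, not part of the original script.

```python
import numpy as np

def logit_fit(X, y, iters=25):
    """Binary logistic regression by Newton's method (IRLS)."""
    X = np.column_stack([np.ones(len(X)), X])     # prepend an intercept column
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ b))              # fitted probabilities
        W = p * (1 - p)                           # IRLS weights
        # Hessian of the log-likelihood, lightly ridged for numerical stability
        H = X.T @ (X * W[:, None]) + 1e-8 * np.eye(len(b))
        b += np.linalg.solve(H, X.T @ (y - p))    # Newton step
    return b

# Synthetic data: one clearly informative predictor
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
y = (x[:, 0] + rng.normal(scale=0.5, size=200) > 0).astype(float)
b = logit_fit(x, y)                               # b[1] should be clearly positive
```

The coefficient p-values reported by mnrfit are Wald tests derived from the inverse of this same Hessian at convergence.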
METHOD 2
Embedded feature selection lets a single method handle both feature types. When this function is given a table as input, it assumes the last column is the response variable and treats all categorical or logical variables as categorical predictors.
Embedded Type Feature Selection
stepwiseglm(HCV)
1. Adding BMI, Deviance = 344.1219, FStat = 6.810953, PValue = 0.009157848
2. Adding NauseaVomting, Deviance = 342.8689, FStat = 5.050527, PValue = 0.02477567
ans =
Generalized linear regression model:
Baselinehistologicalstaging ~ 1 + BMI + NauseaVomting
Distribution = Normal
Estimated Coefficients:
Estimate SE tStat pValue
__________ _________ _______ __________
(Intercept) 0.73428 0.09579 7.6655 3.3388e-14
BMI -0.0086275 0.0032847 -2.6266 0.0087198
NauseaVomting 0.060158 0.026769 2.2473 0.024776
1385 observations, 1382 error degrees of freedom
Estimated Dispersion: 0.248
F-statistic vs. constant model: 5.94, p-value = 0.0027
Thus, only BMI and NauseaVomting are significant variables.
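stepwiseglm's forward phase is a greedy search: at each step it adds the term that most improves the fit, stopping when the improvement is no longer significant. A simplified numpy sketch (synthetic data; residual sum of squares with a hypothetical min_gain threshold standing in for the F-test entry criterion):

```python
import numpy as np

def forward_select(X, y, min_gain=0.02):
    """Greedy forward selection: repeatedly add the column that most reduces
    the residual sum of squares; stop when the relative improvement falls
    below min_gain (a simplified stand-in for stepwiseglm's F-test)."""
    n, d = X.shape
    chosen, rss = [], float(np.sum((y - y.mean()) ** 2))
    while len(chosen) < d:
        best_j, best_rss = None, None
        for j in range(d):
            if j in chosen:
                continue
            A = np.column_stack([np.ones(n), X[:, chosen + [j]]])
            r = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
            if best_rss is None or r @ r < best_rss:
                best_j, best_rss = j, float(r @ r)
        if (rss - best_rss) / rss < min_gain:
            break                                  # no candidate helps enough
        chosen.append(best_j)
        rss = best_rss
    return chosen

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
y = 2 * X[:, 3] + rng.normal(size=300)   # only column 3 carries signal
sel = forward_select(X, y)               # column 3 enters first
```

In the HCV run above, this is exactly the pattern: BMI enters first (largest deviance drop), then NauseaVomting, and no further term passes the entry test.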
METHOD 3
mdls=fitglm(HCV_new,'link','logit','Distribution',"binomial")
Generally, we start by deleting the variable with the highest p-value:
new=HCV_new;
while mdls.devianceTest.pValue(2)>0.05
    [~,i3]=maxk(mdls.Coefficients.pValue(2:end),1);   % highest p-value, excluding the intercept
    sprintf('Variable removed :%s',string(new.Properties.VariableNames(i3)))
    new(:,i3)=[];                                      % drop that variable and refit
    mdls=fitglm(new,'link','logit','Distribution',"binomial");
end
ans = 'Variable removed :Diarrhea'
ans = 'Variable removed :ALT36'
ans = 'Variable removed :Headache'
ans = 'Variable removed :HGB'
ans = 'Variable removed :ALT12'
mdls.devianceTest.pValue(2)
Here we realise that all the other methods assess each feature individually, whereas this one treats them as a whole: it fits a linear model on all the features together and determines the p-value of the model jointly.
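The removal loop above is backward elimination. A numpy sketch of the same idea on synthetic data, using OLS t statistics as a rough stand-in for the GLM coefficient p-values (the t_min threshold is a choice made for this demo, not part of the original script):

```python
import numpy as np

def backward_eliminate(X, y, t_min=3.0):
    """Drop the predictor with the smallest |t| statistic and refit, until
    every remaining |t| >= t_min -- an OLS analogue of removing the
    highest-p-value variable in the loop above."""
    keep = list(range(X.shape[1]))
    while keep:
        A = np.column_stack([np.ones(len(y)), X[:, keep]])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        s2 = resid @ resid / (len(y) - A.shape[1])           # residual variance
        se = np.sqrt(s2 * np.diag(np.linalg.inv(A.T @ A)))   # coefficient std errors
        t = np.abs(beta[1:] / se[1:])                        # skip the intercept
        if t.min() >= t_min:
            break
        keep.pop(int(t.argmin()))                            # drop the weakest predictor
    return keep

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 4))
y = 3 * X[:, 1] - 2 * X[:, 2] + rng.normal(size=300)
kept = backward_eliminate(X, y)          # the two informative columns survive
```

Refitting after each removal matters: dropping one variable changes the standard errors, and hence the significance, of all the others.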
METHOD 4
% Kruskal-Wallis test for the continuous variables
for i=[1,3,11:27]
    p41(i)=kruskalwallis(HCV_new.(i),HCV_new.Baselinehistologicalstaging,"off");
end
% Chi-square test for the categorical variables
for i=[2,4:10,28]
    [~,~,p41(i)] = crosstab(HCV_new.(i),HCV_new.Baselinehistologicalstaging);
end
% Pearson correlation of every predictor (X) with the staging label (Y)
[r1,p42]=corr(X,Y,'type',"Pearson")
r1 = 28×1 (Pearson correlation coefficients; first ten entries shown)
-0.0247
0.0405
-0.0700
-0.0192
0.0598
-0.0019
0.0020
-0.0021
-0.0008
-0.0472
⋮
p42 = 28×1 (corresponding p-values; first ten entries shown)
0.3591
0.1321
0.0092
0.4759
0.0261
0.9441
0.9411
0.9380
0.9763
0.0788
⋮
sortrows(array2table([p41',r1,p42],'RowNames',HCV.Properties.VariableNames(1:28),'VariableNames',{'P-value','Pearsons_Coeff','Pearsons_P-value'}),'P-value')
ans = 28×3 table
| | P-value | Pearsons_Coeff | Pearsons_P-value |
|---|---|---|---|
| BMI | 0.0089 | -0.0700 | 0.0092 |
| NauseaVomting | 0.0261 | 0.0598 | 0.0261 |
| RNAEF | 0.0543 | 0.0553 | 0.0394 |
| Epigastricpain | 0.0787 | -0.0472 | 0.0788 |
| Gender | 0.1319 | 0.0405 | 0.1321 |
| BaselinehistologicalGrading | 0.1379 | -0.0410 | 0.1271 |
| ALTafter24w | 0.1419 | 0.0373 | 0.1648 |
| RNA12 | 0.1659 | 0.0298 | 0.2680 |
| RNABase | 0.1823 | 0.0358 | 0.1834 |
| RNA4 | 0.2354 | -0.0324 | 0.2288 |
| Age | 0.3654 | -0.0247 | 0.3591 |
| ALT1 | 0.3804 | 0.0230 | 0.3929 |
| Plat | 0.4488 | -0.0207 | 0.4424 |
| Fever | 0.4756 | -0.0192 | 0.4759 |
| AST1 | 0.6030 | -0.0142 | 0.5987 |
| ALT48 | 0.6776 | -0.0118 | 0.6603 |
| ALT4 | 0.7387 | -0.0080 | 0.7675 |
| WBC | 0.7479 | 0.0084 | 0.7548 |
| ALT12 | 0.8594 | 0.0046 | 0.8641 |
| ALT24 | 0.8728 | -0.0048 | 0.8578 |
| RNAEOT | 0.8918 | -0.0103 | 0.7020 |
| ALT36 | 0.8920 | 0.0028 | 0.9179 |
| RBC | 0.9047 | -0.0035 | 0.8968 |
| Fatiguegeneralizedboneache | 0.9380 | -0.0021 | 0.9380 |
| Diarrhea | 0.9410 | 0.0020 | 0.9411 |
| Headache | 0.9440 | -0.0019 | 0.9441 |
| Jaundice | 0.9763 | -0.0008 | 0.9763 |
| HGB | 0.9946 | -0.0003 | 0.9900 |
As the p-value increases, the magnitude of the correlation given by Pearson's coefficient decreases. At a significance level of 0.05, Pearson's correlation again flags BMI, NauseaVomting, and RNAEF as significant (as in Method 1), while the Kruskal-Wallis/chi-square p-values flag only BMI and NauseaVomting (as in Method 2).
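The kruskalwallis statistic used in this method can be sketched in a few lines of Python (numpy only, for illustration; this version assumes no tied values, so it omits the tie correction, and the example groups are made up):

```python
import numpy as np

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction), the rank-based
    statistic behind kruskalwallis."""
    data = np.concatenate([np.asarray(g, float) for g in groups])
    order = data.argsort()
    ranks = np.empty(len(data))
    ranks[order] = np.arange(1, len(data) + 1)    # rank 1 = smallest value
    n = len(data)
    h, start = 0.0, 0
    for g in groups:
        r = ranks[start:start + len(g)]            # ranks belonging to this group
        start += len(g)
        h += len(g) * (r.mean() - (n + 1) / 2) ** 2
    return 12 * h / (n * (n + 1))

print(round(kruskal_h([1, 2, 3], [4, 5, 6]), 4))  # 3.8571
```

Because H only depends on ranks, it makes no normality assumption, which is why it is paired here with the categorical chi-square tests rather than a parametric ANOVA.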
METHOD 5
Unsupervised Learning - PCA
Xpca=zscore((HCV_array-min(HCV_array))./(max(HCV_array)-min(HCV_array)));
ExpectedOutput=table2array(HCV_new(:,29));
[coeff,score,latent,~,explained]=pca(Xpca)
coeff = 29×29 (principal-component loadings; display truncated)
score = 1385×29 (principal-component scores; display truncated)
latent(1:10) =
    1.8974
    1.2473
    1.2360
    1.1929
    1.1630
    1.1430
    1.1291
    1.1069
    1.0609
    1.0553
explained(1:10) =
    6.5429
    4.3012
    4.2622
    4.1136
    4.0104
    3.9413
    3.8934
    3.8169
    3.6582
    3.6390
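The two columns above are consistent with the component eigenvalues (`latent`) and the percentages of explained variance (`explained`) returned by `pca`: each percentage is just the eigenvalue's share of the total variance (e.g. 1.8974 / 29 × 100 ≈ 6.5429). A minimal sketch of that relationship, assuming `Xmat` is the normalized 29-column predictor matrix (a name used here only for illustration):

```matlab
% Hedged sketch: recompute the percentage of variance explained from the
% eigenvalues and compare with pca's own 'explained' output.
[coeff, score, latent, ~, explained] = pca(Xmat);
check = 100 * latent / sum(latent);   % eigenvalue share of total variance
max(abs(check - explained))           % ~0 up to floating-point rounding
```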
b=bar(explained(1:15))
b =
Bar with properties:
BarLayout: 'grouped'
BarWidth: 0.8000
FaceColor: [0 0.4470 0.7410]
EdgeColor: [0 0 0]
BaseValue: 0
XData: [1 2 3 4 5 6 7 8 9 10 11 12 13 14 15]
YData: [6.5429 4.3012 4.2622 4.1136 4.0104 3.9413 3.8934 3.8169 3.6582 3.6390 3.5928 3.5524 3.5298 3.4826 3.4101]
labels = string(round(b.YData,2));
xtips = b.XEndPoints;   % label anchor points at the tip of each bar
ytips = b.YEndPoints;
text(xtips,ytips,labels,'HorizontalAlignment','center',...
'VerticalAlignment','bottom')
xlabel('Principal Component')
ylabel('Percentage of explained variance')
title('Principal Component Analysis')
gscatter(score(1:100,1), score(1:100,2), ExpectedOutput(1:100,1))
xlabel('Principal Component 1')
ylabel('Principal Component 2')
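The score scatter above can be complemented by overlaying the variable loadings on the same PC1-PC2 plane. This companion view is not in the original analysis; it is a sketch assuming the `coeff` and `score` outputs of the earlier `pca` call:

```matlab
% Hedged sketch: variable loadings (arrows) drawn over the first 100
% observations' PC1-PC2 scores.
biplot(coeff(:,1:2), 'Scores', score(1:100,1:2), ...
       'VarLabels', HCV_new.Properties.VariableNames(1:29))
xlabel('Principal Component 1')
ylabel('Principal Component 2')
```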
sortrows(array2table([coeff(:,1),coeff(:,2),coeff(:,3),coeff(:,4)],'RowNames',HCV_new.Properties.VariableNames(1:29),'VariableNames',{'PC1','PC2','PC3','PC4'}),'PC1','descend')
ans = 29×4 table
| | PC1 | PC2 | PC3 | PC4 |
|---|---|---|---|---|
| 1 RNAEF | 0.5708 | -0.0114 | 0.0411 | -0.0047 |
| 2 RNAEOT | 0.5628 | -0.0439 | 0.0724 | 0.0392 |
| 3 RNA12 | 0.5518 | -0.0101 | 0.0310 | 0.0457 |
| 4 Epigastricpain | 0.0905 | -0.3127 | -0.0890 | 0.1407 |
| 5 Jaundice | 0.0602 | -0.0545 | -0.1540 | -0.2501 |
| 6 ALT36 | 0.0552 | 0.1020 | -0.0894 | -0.1370 |
| 7 Baselinehistologicalstaging | 0.0522 | 0.2022 | -0.1248 | -0.2079 |
| 8 Plat | 0.0505 | 0.3816 | 0.0122 | 0.2792 |
| 9 ALT4 | 0.0408 | 0.0428 | 0.0827 | -0.2766 |
| 10 Fatiguegeneralizedboneache | 0.0389 | -0.1280 | -0.0237 | 0.0409 |
| 11 Diarrhea | 0.0381 | 0.1366 | -0.3065 | 0.0907 |
| 12 ALT48 | 0.0259 | -0.2446 | -0.3236 | 0.0773 |
| 13 ALTafter24w | 0.0203 | -0.1381 | -0.1882 | 0.0130 |
| 14 NauseaVomting | 0.0196 | 0.5239 | -0.0087 | 0.0392 |
| 15 Fever | 0.0187 | -0.0003 | -0.1485 | 0.1059 |
| 16 ALT24 | 0.0185 | 0.2015 | 0.3262 | -0.2542 |
| 17 RNABase | 0.0147 | 0.0051 | -0.3082 | -0.0953 |
| 18 Headache | 0.0071 | 0.1349 | -0.1997 | -0.2040 |
| 19 Gender | -0.0069 | -0.0093 | -0.0289 | 0.0738 |
| 20 HGB | -0.0125 | 0.0099 | 0.4005 | 0.1555 |
| 21 ALT1 | -0.0189 | 0.2657 | -0.3702 | 0.2814 |
| 22 AST1 | -0.0287 | 0.0170 | -0.0467 | 0.0505 |
| 23 Age | -0.0445 | -0.0331 | -0.1547 | -0.1982 |
| 24 ALT12 | -0.0472 | -0.1595 | 0.1466 | -0.3914 |
| 25 BaselinehistologicalGrading | -0.0524 | -0.1264 | -0.0108 | 0.1052 |
| 26 RNA4 | -0.0575 | -0.1885 | 0.0403 | 0.1741 |
| 27 BMI | -0.0614 | -0.0124 | 0.2295 | 0.4466 |
| 28 RBC | -0.0626 | 0.2296 | 0.1561 | -0.0386 |
| 29 WBC | -0.0800 | -0.2216 | 0.0844 | 0.0851 |
%Mean absolute loading across the first two principal components
PC12=mean(abs(coeff(:,1:2)),2);
sortrows(table(PC12,'RowNames',HCV_new.Properties.VariableNames),'PC12','descend')
ans = 29×1 table
| | PC12 |
|---|---|
| 1 RNAEOT | 0.3033 |
| 2 RNAEF | 0.2911 |
| 3 RNA12 | 0.2809 |
| 4 NauseaVomting | 0.2717 |
| 5 Plat | 0.2161 |
| 6 Epigastricpain | 0.2016 |
| 7 WBC | 0.1508 |
| 8 RBC | 0.1461 |
| 9 ALT1 | 0.1423 |
| 10 ALT48 | 0.1352 |
| 11 Baselinehistologicalstaging | 0.1272 |
| 12 RNA4 | 0.1230 |
| 13 ALT24 | 0.1100 |
| 14 ALT12 | 0.1034 |
| 15 BaselinehistologicalGrading | 0.0894 |
| 16 Diarrhea | 0.0874 |
| 17 Fatiguegeneralizedboneache | 0.0835 |
| 18 ALTafter24w | 0.0792 |
| 19 ALT36 | 0.0786 |
| 20 Headache | 0.0710 |
| 21 Jaundice | 0.0574 |
| 22 ALT4 | 0.0418 |
| 23 Age | 0.0388 |
| 24 BMI | 0.0369 |
| 25 AST1 | 0.0228 |
| 26 HGB | 0.0112 |
| 27 RNABase | 0.0099 |
| 28 Fever | 0.0095 |
| 29 Gender | 0.0081 |
%Mean absolute loading across the first four principal components
PC14=mean(abs(coeff(:,1:4)),2);
sortrows(table(PC14,'RowNames',HCV_new.Properties.VariableNames),'PC14','descend')
ans = 8×1 table
| | PC14 |
|---|---|
| 1 ALT1 | 0.2340 |
| 2 ALT24 | 0.2001 |
| 3 BMI | 0.1875 |
| 4 ALT12 | 0.1862 |
| 5 Plat | 0.1809 |
| 6 RNAEOT | 0.1796 |
| 7 ALT48 | 0.1679 |
| 8 RNA12 | 0.1596 |
%Variables removed: mean absolute loading across PC1-PC2 below the 0.1050 threshold
del1=X4.Properties.VariableNames(find(PC12<0.1050)')
'Age' 'Gender' 'BMI' 'Fever' 'Headache' 'Diarrhea' 'Fatiguegeneralizedboneache' 'Jaundice' 'HGB' 'AST1' 'ALT4' 'ALT12' 'ALT36' 'ALTafter24w' 'RNABase' 'BaselinehistologicalGrading'
X4=removevars(X4,del1)
X4 = 1385×13 table
| | NauseaVomting | Epigastricpain | WBC | RBC | Plat | ALT1 | ALT24 | ALT48 | RNA4 | RNA12 | RNAEOT | RNAEF | Baselinehistologicalstaging |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 0 | 1 | 0.4867 | 0.3597 | 0.1433 | 0.5056 | 0.4719 | 0 | 0.5280 | 0.0772 | 0 | 0 | 0 |
| 2 | 1 | 0 | 1.0000 | 0.5100 | 0.2724 | 0.9438 | 0.8315 | 0.9593 | 0.4482 | 0.1707 | 0.4166 | 0.0384 | 0 |
| 3 | 1 | 0 | 0.1303 | 0.6695 | 0.4384 | 0.1124 | 0.8652 | 0 | 0.5503 | 0 | 0.9103 | 0.6896 | 1 |
| 4 | 1 | 0 | 0.3841 | 0.8138 | 0.4005 | 0.2809 | 0.5506 | 0.5854 | 0.3744 | 0.1570 | 0.9209 | 0.7186 | 1 |
| 5 | 0 | 1 | 0.0735 | 0.6572 | 0.7094 | 0.7303 | 0.9101 | 0.6911 | 0.6147 | 1.0000 | 0.4193 | 0.2997 | 0 |
| 6 | 1 | 0 | 0.9653 | 0.0549 | 0.2864 | 0.7303 | 0.2921 | 0.8862 | 0.9044 | 0 | 0 | 0 | 1 |
| 7 | 0 | 1 | 0.9472 | 0.7744 | 0.6313 | 0.2022 | 0.7640 | 0.6098 | 0.8604 | 0.0737 | 0.2654 | 0.7838 | 1 |
| 8 | 0 | 1 | 0.4768 | 0.4904 | 0.9229 | 0.8202 | 0.0674 | 0.3902 | 0.0600 | 0.2110 | 0.4584 | 0.6248 | 1 |
| 9 | 0 | 1 | 0.8221 | 0.6589 | 0.4187 | 0.4944 | 0.6517 | 0.2764 | 0.6302 | 0 | 0.4590 | 0.2506 | 0 |
| 10 | 0 | 1 | 0.4050 | 0.5315 | 0.0389 | 0.3258 | 0.4719 | 0.3089 | 0.1918 | 0.0716 | 0.3405 | 0.6855 | 0 |
| 11 | 0 | 0 | 0.1587 | 0.3732 | 0.5471 | 0.9551 | 0.1573 | 0.7805 | 0.0859 | 0.1961 | 0.5547 | 0.0740 | 0 |
| 12 | 1 | 0 | 0.3360 | 0.2611 | 0.3841 | 0.1124 | 0.0787 | 0.4390 | 0.9345 | 0.1505 | 0.0781 | 0.9949 | 0 |
| 13 | 0 | 1 | 0.6902 | 0.2500 | 0.8242 | 0.6966 | 0.6292 | 0.8049 | 0.4468 | 0 | 0 | 0 | 0 |
| 14 | 1 | 0 | 0.2906 | 0.4203 | 0.3604 | 0.2472 | 0.4382 | 0.8780 | 0.7359 | 0.1573 | 0.2261 | 0.9652 | 0 |
| 15 | 1 | 0 | 0.9348 | 0.2905 | 0.9731 | 0.9326 | 0.5506 | 0.8618 | 0.9252 | 0.1129 | 0.5412 | 0.1538 | 0 |
| 16 | 0 | 0 | 0.5537 | 0.8985 | 0.4233 | 0.1573 | 0.9101 | 0.6585 | 0.2615 | 0.0224 | 0.8306 | 0.1668 | 0 |
| 17 | 1 | 1 | 0.8619 | 0.2902 | 0.7840 | 0.7416 | 0.4944 | 0.3415 | 0.6004 | 0 | 0 | 0 | 1 |
| 18 | 0 | 1 | 0.6021 | 0.5411 | 0.5265 | 0.6966 | 0.8202 | 0.5122 | 0.1922 | 0.1227 | 0.3938 | 0.3164 | 1 |
| 19 | 1 | 1 | 0.3960 | 0.5258 | 0.7316 | 0.9551 | 0.2022 | 0.7154 | 0.2989 | 0.1992 | 0.5012 | 0.2011 | 1 |
| 20 | 1 | 1 | 0.2035 | 0.5155 | 0.1409 | 0.9213 | 0.6292 | 0.7154 | 0.0636 | 0.1922 | 0.5001 | 0.5895 | 1 |
| 21 | 0 | 0 | 0.3221 | 0.1790 | 0.1764 | 0.7865 | 0.1798 | 0.6423 | 0.6541 | 0.1793 | 0.6570 | 0.3486 | 0 |
| 22 | 0 | 1 | 0.7641 | 0.9803 | 0.1200 | 0.4270 | 0.8764 | 0.3008 | 0.4766 | 0 | 0 | 0 | 0 |
| 23 | 1 | 1 | 0.5456 | 0.6481 | 0.0129 | 0.5955 | 0.0112 | 0.7154 | 0.3666 | 0.0142 | 0.0226 | 0.2312 | 1 |
| 24 | 1 | 1 | 0.4550 | 0.6729 | 0.8868 | 0.7079 | 0.8090 | 0.4309 | 0.8260 | 0.0259 | 0.4142 | 0.9413 | 1 |
| 25 | 1 | 1 | 0.3366 | 0.4029 | 0.9676 | 0.5843 | 0.0225 | 0.8211 | 0.8166 | 0.0033 | 0.4453 | 0.9297 | 1 |
| 26 | 1 | 0 | 0.4014 | 0.5931 | 0.1263 | 0.8202 | 0.2247 | 0.5935 | 0.4499 | 0 | 0 | 0 | 1 |
| 27 | 0 | 1 | 0.8827 | 0.1965 | 0.0112 | 0.1685 | 0.3596 | 0.6667 | 0.4421 | 0.1977 | 0.9090 | 0.4601 | 0 |
| 28 | 0 | 1 | 0.2462 | 0.9066 | 0.7291 | 0.9101 | 0.5506 | 0.7642 | 0.0383 | 0.0122 | 0.9070 | 0.0241 | 0 |
| 29 | 1 | 1 | 0.3345 | 0.7877 | 0.2526 | 0.8876 | 0.6067 | 0.6341 | 0.1553 | 0 | 0 | 0 | 1 |
| 30 | 0 | 1 | 0.3134 | 0.7796 | 0.0878 | 0.3708 | 0.0674 | 0.3577 | 0.2062 | 0.2055 | 0.4665 | 0.3889 | 0 |
| 31 | 1 | 1 | 0.2626 | 0.1522 | 0.6688 | 0.1124 | 0.2584 | 0.6179 | 0.7728 | 0.0080 | 0.1537 | 0.3012 | 0 |
| 32 | 1 | 0 | 0.4816 | 0.1518 | 0.8100 | 0.7978 | 0.3371 | 0.5447 | 0.2394 | 0.1671 | 0.0827 | 0.0432 | 0 |
| 33 | 0 | 1 | 0.4934 | 0.6515 | 0.5571 | 0.2809 | 0.2809 | 0.7398 | 0.3464 | 0.0867 | 0.8865 | 0.8374 | 0 |
| 34 | 1 | 0 | 0.9637 | 0.6362 | 0.2445 | 0.0899 | 0.1011 | 0.3902 | 0.2650 | 0.1241 | 0.4717 | 0.3491 | 0 |
| 35 | 1 | 1 | 0.3787 | 0.2155 | 0.1928 | 0.8876 | 0.1011 | 0.7805 | 0.0762 | 0.1084 | 0.8338 | 0.2993 | 1 |
| 36 | 1 | 1 | 0.8027 | 0.2797 | 0.2083 | 0.7079 | 0.4607 | 0.9919 | 0.0429 | 0.0984 | 0.7273 | 0.9210 | 1 |
| 37 | 1 | 0 | 0.4801 | 0.1720 | 0.9701 | 0.9775 | 0.9775 | 0.3984 | 0.4070 | 0.1237 | 0.4156 | 0.3545 | 0 |
| 38 | 0 | 0 | 0.8467 | 0.9111 | 0.5898 | 0.0449 | 0.6966 | 0.6992 | 0.4835 | 0.2116 | 0.3252 | 0.1468 | 0 |
| 39 | 0 | 0 | 0.0020 | 0.4474 | 0.0194 | 0.2135 | 0.2360 | 0.8862 | 0.0592 | 0.0076 | 0.0384 | 0.0017 | 0 |
| 40 | 1 | 1 | 0.7645 | 0.1024 | 0.7750 | 0.4719 | 0.3258 | 1.0000 | 0.0853 | 0.1092 | 0.2721 | 0.5004 | 1 |
| 41 | 1 | 1 | 0.3991 | 0.0609 | 0.6735 | 0.3371 | 0.0899 | 0.8618 | 0.2666 | 0.0936 | 0.6764 | 0.7947 | 0 |
| 42 | 1 | 0 | 0.8125 | 0.4095 | 0.6826 | 0.1910 | 0.6966 | 0.6911 | 0.1717 | 0.0405 | 0.3806 | 0.2154 | 0 |
| 43 | 0 | 1 | 0.7953 | 0.8165 | 0.0637 | 0.5955 | 0.6404 | 0.8374 | 0.8831 | 0.0619 | 0.2499 | 0.3626 | 1 |
| 44 | 1 | 1 | 0.1532 | 0.7649 | 0.6344 | 0.1011 | 0.4831 | 0.7805 | 0.9627 | 0.1170 | 0.9008 | 0.8457 | 1 |
| 45 | 1 | 1 | 0.9806 | 0.0716 | 0.0358 | 0.5618 | 0.2697 | 0.8862 | 0.3518 | 0.1104 | 0.1132 | 0.4645 | 1 |
| 46 | 1 | 1 | 0.7847 | 0.6273 | 0.8753 | 0.8652 | 0.5056 | 0.5366 | 0.7221 | 0.2033 | 0.3954 | 0.1972 | 0 |
| 47 | 0 | 1 | 0.0526 | 0.3295 | 0.4296 | 0.3258 | 0.8764 | 0.3902 | 0.6870 | 0.0954 | 0.1249 | 0.7826 | 1 |
| 48 | 0 | 1 | 0.2666 | 0.3873 | 0.7042 | 0.0112 | 0.4382 | 0.4228 | 0.9964 | 0.0484 | 0.6488 | 0.3601 | 0 |
| 49 | 0 | 0 | 0.4360 | 0.9617 | 0.7223 | 0.1011 | 0.3483 | 0.8537 | 0.5821 | 0.1818 | 0.9472 | 0.6743 | 1 |
| 50 | 0 | 1 | 0.3576 | 0.4253 | 0.3378 | 0.7865 | 0.6292 | 0.5447 | 0.0412 | 0.0165 | 0.8715 | 0.3002 | 0 |
| 51 | 0 | 0 | 0.2308 | 0.7181 | 0.2269 | 0.3596 | 0.4831 | 0.8130 | 0.8311 | 0.2026 | 0.2165 | 0.2981 | 1 |
| 52 | 0 | 1 | 0.1982 | 0.1978 | 0.6211 | 0.2135 | 0.4607 | 0.7805 | 0.1088 | 0.0116 | 0.3718 | 0.1137 | 0 |
| 53 | 0 | 1 | 0.2250 | 0.2718 | 0.2088 | 0.6067 | 0.1910 | 0.7724 | 0.3048 | 0.0036 | 0.7114 | 0.4092 | 0 |
| 54 | 0 | 1 | 0.4292 | 0.5036 | 0.8937 | 0.0225 | 0.0562 | 0.9756 | 0.2146 | 0.1025 | 0.8321 | 0.4753 | 0 |
| 55 | 1 | 0 | 0.4682 | 0.1025 | 0.7852 | 0.5056 | 0.8315 | 0.5041 | 0.4777 | 0.0954 | 0.0971 | 0.7492 | 0 |
| 56 | 0 | 0 | 0.5739 | 0.1556 | 0.9343 | 0.1685 | 0 | 0.3333 | 0.1736 | 0.0714 | 0.0664 | 0.5940 | 0 |
| 57 | 1 | 0 | 0.1566 | 0.6947 | 0.6819 | 0.4045 | 0.6292 | 0.3171 | 0.7325 | 0.0469 | 0.2242 | 0.2272 | 1 |
| 58 | 0 | 1 | 0.3696 | 0.2768 | 0.2095 | 0.3146 | 0.3596 | 0.8618 | 0.1252 | 0.1568 | 0.0078 | 0.6631 | 1 |
| 59 | 0 | 1 | 0.6233 | 0.6654 | 0.3420 | 0.0787 | 0.5056 | 0.4146 | 0.4975 | 0.0910 | 0.1447 | 0.9798 | 0 |
| 60 | 0 | 1 | 0.7074 | 0.2495 | 0.2797 | 0.2697 | 0.0225 | 0.2927 | 0.9588 | 0.0250 | 0.7910 | 0.0785 | 0 |
| 61 | 1 | 1 | 0.8950 | 0.8549 | 0.5402 | 0.3820 | 0.7640 | 0.3008 | 0.3896 | 0.0261 | 0.3329 | 0.2017 | 1 |
| 62 | 0 | 1 | 0.2271 | 0.5141 | 0.3390 | 0.2697 | 0.8315 | 0.6829 | 0.1992 | 0.2025 | 0.3590 | 0.7945 | 1 |
| 63 | 0 | 0 | 0.5241 | 0.1211 | 0.2444 | 0.5618 | 0.0337 | 0.4065 | 0.4533 | 0.1985 | 0.5795 | 0.2571 | 0 |
| 64 | 0 | 1 | 0.8659 | 0.2341 | 0.5304 | 0.9775 | 0.7191 | 0.8862 | 0.1798 | 0.0796 | 0.9347 | 0.9626 | 0 |
| 65 | 1 | 1 | 0.9329 | 0.0521 | 0.6354 | 0.2921 | 0.6854 | 0.8537 | 0.9234 | 0.1757 | 0.9319 | 0.6844 | 0 |
| 66 | 1 | 0 | 0.1198 | 0.7609 | 0.9485 | 0.5281 | 1.0000 | 0.9268 | 0.0573 | 0.1695 | 0.2569 | 0.3195 | 1 |
| 67 | 1 | 0 | 0.2291 | 0.6986 | 0.7945 | 0.6629 | 0.9551 | 0.3252 | 0.5943 | 0.0249 | 0.3886 | 0.1268 | 0 |
| 68 | 0 | 1 | 0.1744 | 0.5596 | 0.5806 | 0.7191 | 0.8989 | 0.4228 | 0.2418 | 0.0688 | 0.1171 | 0.6714 | 1 |
| 69 | 0 | 1 | 0.5480 | 0.8783 | 0.9333 | 0.9888 | 0.6629 | 0.4553 | 0.3876 | 0.1026 | 0.9592 | 0.0302 | 0 |
| 70 | 1 | 1 | 0.2754 | 0.1342 | 0.8278 | 0.0562 | 0.7865 | 0.4146 | 0.6137 | 0.0609 | 0.5073 | 0.7858 | 1 |
| 71 | 1 | 0 | 0.1059 | 0.4158 | 0.1443 | 0.9888 | 0.5843 | 0.5772 | 0.3746 | 0.0987 | 0.4659 | 0.8688 | 0 |
| 72 | 0 | 1 | 0.7180 | 0.6420 | 0.5212 | 0.4382 | 0.0112 | 0.7317 | 0.9642 | 0.1893 | 0.3216 | 0.2351 | 0 |
| 73 | 1 | 0 | 0.0619 | 0.6537 | 0.5012 | 0.2472 | 0.4944 | 0.8211 | 0.7722 | 0.1487 | 0.4640 | 0.1944 | 1 |
| 74 | 0 | 1 | 0.1454 | 0.5514 | 0.4385 | 0.8315 | 0.0674 | 1.0000 | 0.0830 | 0.2111 | 0.5116 | 0.3912 | 1 |
| 75 | 1 | 1 | 0.4450 | 0.6702 | 0.0618 | 0.9888 | 0.5393 | 0.5691 | 0.0007 | 0.1394 | 0.5987 | 0.5632 | 1 |
| 76 | 1 | 1 | 0.5432 | 0.4275 | 0.4390 | 0.6966 | 0.6292 | 0.3171 | 0.9544 | 0.0056 | 0.1685 | 0.2125 | 0 |
| 77 | 0 | 1 | 0.9883 | 0.3147 | 0.5987 | 0.2584 | 0.6180 | 0.6829 | 0.1142 | 0.1475 | 0.3731 | 0.2895 | 1 |
| 78 | 1 | 1 | 0.6453 | 0.8337 | 0.3198 | 0.2697 | 0.9101 | 0.7886 | 0.4313 | 0.1639 | 0.5642 | 0.4970 | 0 |
| 79 | 1 | 0 | 0.1382 | 0.5442 | 0.9565 | 0.1798 | 0.0787 | 0.9431 | 0.9058 | 0.0195 | 0.9834 | 0.5352 | 1 |
| 80 | 1 | 1 | 0.6269 | 0.4080 | 0.9755 | 0.6966 | 0.3146 | 0.5122 | 0.3252 | 0.0528 | 0.4804 | 0.4098 | 1 |
| 81 | 1 | 0 | 0.2765 | 0.7185 | 0.1713 | 0.2697 | 0.6404 | 0.7805 | 0.0055 | 0.1526 | 0.2926 | 0.7740 | 0 |
| 82 | 1 | 1 | 0.8412 | 0.4145 | 0.1519 | 0.6966 | 0.2809 | 0.9756 | 0.0378 | 0.0457 | 0.2436 | 0.7992 | 0 |
| 83 | 0 | 1 | 0.6876 | 0.7791 | 0.2973 | 0.3371 | 0.8427 | 0.3659 | 0.8533 | 0.1437 | 0.5568 | 0.6018 | 0 |
| 84 | 1 | 1 | 0.3131 | 0.8621 | 0.0791 | 0.7416 | 0.9775 | 0.5366 | 0.1348 | 0.0759 | 0.2940 | 0.7635 | 1 |
| 85 | 0 | 1 | 0.9462 | 0.3339 | 0.2402 | 0.6180 | 0.3483 | 0.3089 | 0.6850 | 0.0122 | 0.5530 | 0.9310 | 1 |
| 86 | 1 | 1 | 0.3552 | 0.5715 | 0.1513 | 1.0000 | 0.9775 | 0.5935 | 0.7232 | 0.0139 | 0.3238 | 0.2328 | 0 |
| 87 | 1 | 0 | 0.4179 | 0.0630 | 0.4377 | 0.3146 | 0.8539 | 0.6992 | 0.5790 | 0.1098 | 0.7833 | 0.1185 | 1 |
| 88 | 1 | 1 | 0.3986 | 0.1145 | 0.0996 | 0.2472 | 0.5281 | 0.2764 | 0.2688 | 0.1566 | 0.5732 | 0.8276 | 1 |
| 89 | 1 | 1 | 0.8066 | 0.9401 | 0.5140 | 0.7753 | 0.2472 | 0.5285 | 0.1972 | 0.1902 | 0.6841 | 0.6951 | 0 |
| 90 | 0 | 0 | 0.3345 | 0.2010 | 0.2604 | 0.0674 | 0.3596 | 0.4715 | 0.8424 | 0.0807 | 0.4268 | 0.7085 | 0 |
| 91 | 0 | 0 | 0.3334 | 0.1418 | 0.8980 | 0.6404 | 0.3483 | 0.4715 | 0.4929 | 0.0088 | 0.4450 | 0.2935 | 0 |
| 92 | 1 | 1 | 0.8125 | 0.0376 | 0.4727 | 0.3483 | 0.3708 | 0.6016 | 0.4819 | 0.0999 | 0.7092 | 0.5395 | 0 |
| 93 | 0 | 0 | 0.8912 | 0.2587 | 0.0501 | 0.7753 | 0.8652 | 0.6667 | 0.2847 | 0 | 0 | 0 | 1 |
| 94 | 1 | 1 | 0.0821 | 0.1595 | 0.2869 | 0.1348 | 0.5506 | 0.7154 | 0.8415 | 0.0304 | 0.9187 | 0.4674 | 1 |
| 95 | 0 | 1 | 0.4681 | 0.2667 | 0.3736 | 0.7640 | 0.6067 | 0.4390 | 0.9873 | 0.1769 | 0.6873 | 0.6153 | 1 |
| 96 | 1 | 0 | 0.8026 | 0.8435 | 0.7829 | 0.4719 | 0.2472 | 0.3089 | 0.9693 | 0 | 0 | 0 | 1 |
| 97 | 1 | 0 | 0.4434 | 0.4206 | 0.9468 | 0.3483 | 0.8876 | 0.4634 | 0.7847 | 0.2139 | 0.9619 | 0.4873 | 0 |
| 98 | 1 | 1 | 0.3623 | 0.1069 | 0.7452 | 0.0899 | 0.4270 | 0.3008 | 0.5346 | 0.0742 | 0.1533 | 0.1801 | 0 |
| 99 | 0 | 0 | 0.9547 | 0.6710 | 0.0787 | 0.4157 | 0.9438 | 0.9350 | 0.3204 | 0 | 0 | 0 | 0 |
| 100 | 1 | 1 | 0.6419 | 0.4627 | 0.9043 | 1.0000 | 0.1236 | 0.4797 | 0.1471 | 0.0554 | 0.1998 | 0.1518 | 0 |
| ⋮ |
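The same select-by-threshold pattern is applied twice (mean absolute loading over PC1-PC2, then over PC1-PC4), so it can be expressed once as a small helper. This is a sketch, not code from the original analysis: `dropLowLoading` is a hypothetical name introduced here, and `coeff` is assumed to be the loading matrix from the earlier `pca` call.

```matlab
function Tout = dropLowLoading(T, coeff, nPCs, thresh)
% Drop the variables of table T whose mean absolute loading over the
% first nPCs principal components falls below thresh.
meanLoad = mean(abs(coeff(:, 1:nPCs)), 2);
del = T.Properties.VariableNames(meanLoad < thresh);
Tout = removevars(T, del);
end
```

With this helper, the two removals become `X4 = dropLowLoading(X4, coeff, 2, 0.1050)` and `X5 = dropLowLoading(X5, coeff, 4, 0.1050)`.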
%Variables removed: mean absolute loading across PC1-PC4 below the 0.1050 threshold
del2=X5.Properties.VariableNames(find(PC14<0.1050)')
'Gender' 'Fever' 'Fatiguegeneralizedboneache' 'AST1' 'ALT36' 'ALTafter24w' 'BaselinehistologicalGrading'
X5=removevars(X5,del2)
X5 = 1385×22 table
| | Age | BMI | NauseaVomting | Headache | Diarrhea | Jaundice | Epigastricpain | WBC | RBC | HGB | Plat | ALT1 | ALT4 | ALT12 | ALT24 | ALT48 | RNABase | RNA4 | RNA12 | RNAEOT | RNAEF | Baselinehistologicalstaging |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.8276 | 1.0000 | 0 | 0 | 0 | 1 | 1 | 0.4867 | 0.3597 | 0.8000 | 0.1433 | 0.5056 | 0.1461 | 0.7865 | 0.4719 | 0 | 0.5456 | 0.5280 | 0.0772 | 0 | 0 | 0 |
| 2 | 0.4828 | 0.5385 | 1 | 1 | 0 | 1 | 0 | 1.0000 | 0.5100 | 0 | 0.2724 | 0.9438 | 0.6292 | 0.4045 | 0.8315 | 0.9593 | 0.0338 | 0.4482 | 0.1707 | 0.4166 | 0.0384 | 0 |
| 3 | 0.8621 | 0.8462 | 1 | 1 | 1 | 0 | 0 | 0.1303 | 0.6695 | 0.4000 | 0.4384 | 0.1124 | 0.6292 | 0.7640 | 0.8652 | 0 | 0.4755 | 0.5503 | 0 | 0.9103 | 0.6896 | 1 |
| 4 | 0.5862 | 0.8462 | 1 | 0 | 1 | 1 | 0 | 0.3841 | 0.8138 | 0 | 0.4005 | 0.2809 | 0.7865 | 0.4607 | 0.5506 | 0.5854 | 0.8675 | 0.3744 | 0.1570 | 0.9209 | 0.7186 | 1 |
| 5 | 0.9310 | 0.7692 | 0 | 1 | 0 | 1 | 1 | 0.0735 | 0.6572 | 0.2000 | 0.7094 | 0.7303 | 0.3146 | 0.1011 | 0.9101 | 0.6911 | 0.5498 | 0.6147 | 1.0000 | 0.4193 | 0.2997 | 0 |
| 6 | 0.8966 | 0 | 1 | 1 | 0 | 1 | 0 | 0.9653 | 0.0549 | 1.0000 | 0.2864 | 0.7303 | 0.9213 | 0.6404 | 0.2921 | 0.8862 | 0.9637 | 0.9044 | 0 | 0 | 0 | 1 |
| 7 | 0.3448 | 0.3077 | 0 | 1 | 1 | 1 | 1 | 0.9472 | 0.7744 | 0.4000 | 0.6313 | 0.2022 | 0.8315 | 0.8876 | 0.7640 | 0.6098 | 0.2712 | 0.8604 | 0.0737 | 0.2654 | 0.7838 | 1 |
| 8 | 0.5517 | 0.6154 | 0 | 1 | 1 | 0 | 1 | 0.4768 | 0.4904 | 0.2000 | 0.9229 | 0.8202 | 0.4607 | 0.9888 | 0.0674 | 0.3902 | 0.5338 | 0.0600 | 0.2110 | 0.4584 | 0.6248 | 1 |
| 9 | 0.4138 | 0.0769 | 0 | 1 | 1 | 0 | 1 | 0.8221 | 0.6589 | 0.4000 | 0.4187 | 0.4944 | 0.1798 | 0.7079 | 0.6517 | 0.2764 | 0.4924 | 0.6302 | 0 | 0.4590 | 0.2506 | 0 |
| 10 | 0.4483 | 0.6154 | 0 | 1 | 1 | 0 | 1 | 0.4050 | 0.5315 | 0.4000 | 0.0389 | 0.3258 | 0.3708 | 0.9888 | 0.4719 | 0.3089 | 0.9585 | 0.1918 | 0.0716 | 0.3405 | 0.6855 | 0 |
| 11 | 0.1724 | 0.1538 | 0 | 1 | 0 | 1 | 0 | 0.1587 | 0.3732 | 0.4000 | 0.5471 | 0.9551 | 0.8090 | 0.3933 | 0.1573 | 0.7805 | 0.8518 | 0.0859 | 0.1961 | 0.5547 | 0.0740 | 0 |
| 12 | 0.1379 | 0 | 1 | 0 | 0 | 0 | 0 | 0.3360 | 0.2611 | 0.6000 | 0.3841 | 0.1124 | 0.6067 | 0.1461 | 0.0787 | 0.4390 | 0.1146 | 0.9345 | 0.1505 | 0.0781 | 0.9949 | 0 |
| 13 | 0.4483 | 0.2308 | 0 | 0 | 0 | 0 | 1 | 0.6902 | 0.2500 | 0.6000 | 0.8242 | 0.6966 | 0.3034 | 0.1573 | 0.6292 | 0.8049 | 0.7797 | 0.4468 | 0 | 0 | 0 | 0 |
| 14 | 0.0690 | 0 | 1 | 0 | 0 | 1 | 0 | 0.2906 | 0.4203 | 0.8000 | 0.3604 | 0.2472 | 0.2809 | 0.1348 | 0.4382 | 0.8780 | 0.3272 | 0.7359 | 0.1573 | 0.2261 | 0.9652 | 0 |
| 15 | 0.2759 | 0.7692 | 1 | 1 | 0 | 0 | 0 | 0.9348 | 0.2905 | 0.8000 | 0.9731 | 0.9326 | 0.7528 | 0.7416 | 0.5506 | 0.8618 | 0.9439 | 0.9252 | 0.1129 | 0.5412 | 0.1538 | 0 |
| 16 | 0.8966 | 0.9231 | 0 | 0 | 0 | 0 | 0 | 0.5537 | 0.8985 | 0.2000 | 0.4233 | 0.1573 | 0.1236 | 0.4607 | 0.9101 | 0.6585 | 0.5120 | 0.2615 | 0.0224 | 0.8306 | 0.1668 | 0 |
| 17 | 1.0000 | 1.0000 | 1 | 1 | 1 | 0 | 1 | 0.8619 | 0.2902 | 0 | 0.7840 | 0.7416 | 0.3483 | 0.5281 | 0.4944 | 0.3415 | 0.7494 | 0.6004 | 0 | 0 | 0 | 1 |
| 18 | 0.7931 | 0.1538 | 0 | 1 | 1 | 1 | 1 | 0.6021 | 0.5411 | 0.8000 | 0.5265 | 0.6966 | 0.1236 | 0.6292 | 0.8202 | 0.5122 | 0.9536 | 0.1922 | 0.1227 | 0.3938 | 0.3164 | 1 |
| 19 | 0.8276 | 0.3846 | 1 | 1 | 1 | 1 | 1 | 0.3960 | 0.5258 | 1.0000 | 0.7316 | 0.9551 | 0.2584 | 0.4157 | 0.2022 | 0.7154 | 0.4219 | 0.2989 | 0.1992 | 0.5012 | 0.2011 | 1 |
| 20 | 0.1034 | 0.0769 | 1 | 0 | 0 | 0 | 1 | 0.2035 | 0.5155 | 0 | 0.1409 | 0.9213 | 0.2697 | 0.9888 | 0.6292 | 0.7154 | 0.8996 | 0.0636 | 0.1922 | 0.5001 | 0.5895 | 1 |
| 21 | 0.8621 | 0.0769 | 0 | 1 | 1 | 0 | 0 | 0.3221 | 0.1790 | 1.0000 | 0.1764 | 0.7865 | 0.8876 | 0.8989 | 0.1798 | 0.6423 | 0.1412 | 0.6541 | 0.1793 | 0.6570 | 0.3486 | 0 |
| 22 | 0.0345 | 0.2308 | 0 | 1 | 1 | 1 | 1 | 0.7641 | 0.9803 | 0 | 0.1200 | 0.4270 | 0.3146 | 0.4719 | 0.8764 | 0.3008 | 0.9451 | 0.4766 | 0 | 0 | 0 | 0 |
| 23 | 0.3103 | 0.0769 | 1 | 1 | 1 | 0 | 1 | 0.5456 | 0.6481 | 0.8000 | 0.0129 | 0.5955 | 0.7191 | 0.7303 | 0.0112 | 0.7154 | 0.2443 | 0.3666 | 0.0142 | 0.0226 | 0.2312 | 1 |
| 24 | 0.2414 | 0.5385 | 1 | 0 | 1 | 0 | 1 | 0.4550 | 0.6729 | 0 | 0.8868 | 0.7079 | 0.4157 | 0.2135 | 0.8090 | 0.4309 | 0.8275 | 0.8260 | 0.0259 | 0.4142 | 0.9413 | 1 |
| 25 | 0.0345 | 0.1538 | 1 | 1 | 1 | 0 | 1 | 0.3366 | 0.4029 | 0.2000 | 0.9676 | 0.5843 | 0.8652 | 1.0000 | 0.0225 | 0.8211 | 0.2027 | 0.8166 | 0.0033 | 0.4453 | 0.9297 | 1 |
| 26 | 0.3793 | 0.9231 | 1 | 1 | 0 | 0 | 0 | 0.4014 | 0.5931 | 1.0000 | 0.1263 | 0.8202 | 0.6742 | 0.5169 | 0.2247 | 0.5935 | 0.7954 | 0.4499 | 0 | 0 | 0 | 1 |
| 27 | 0.6552 | 0.9231 | 0 | 1 | 1 | 0 | 1 | 0.8827 | 0.1965 | 1.0000 | 0.0112 | 0.1685 | 1.0000 | 0.2809 | 0.3596 | 0.6667 | 0.6380 | 0.4421 | 0.1977 | 0.9090 | 0.4601 | 0 |
| 28 | 0.2414 | 0.8462 | 0 | 1 | 0 | 0 | 1 | 0.2462 | 0.9066 | 0.4000 | 0.7291 | 0.9101 | 0.8315 | 0.4045 | 0.5506 | 0.7642 | 0.4050 | 0.0383 | 0.0122 | 0.9070 | 0.0241 | 0 |
| 29 | 0.8621 | 0.3077 | 1 | 1 | 0 | 0 | 1 | 0.3345 | 0.7877 | 0.6000 | 0.2526 | 0.8876 | 0.6629 | 0.0337 | 0.6067 | 0.6341 | 0.2376 | 0.1553 | 0 | 0 | 0 | 1 |
| 30 | 0.5172 | 0.5385 | 0 | 1 | 0 | 0 | 1 | 0.3134 | 0.7796 | 1.0000 | 0.0878 | 0.3708 | 0.8764 | 0.9775 | 0.0674 | 0.3577 | 0.3548 | 0.2062 | 0.2055 | 0.4665 | 0.3889 | 0 |
| 31 | 0.7931 | 0.8462 | 1 | 1 | 0 | 0 | 1 | 0.2626 | 0.1522 | 1.0000 | 0.6688 | 0.1124 | 0.2247 | 0.5506 | 0.2584 | 0.6179 | 0.9944 | 0.7728 | 0.0080 | 0.1537 | 0.3012 | 0 |
| 32 | 0.8966 | 1.0000 | 1 | 1 | 1 | 0 | 0 | 0.4816 | 0.1518 | 0 | 0.8100 | 0.7978 | 1.0000 | 0.6404 | 0.3371 | 0.5447 | 0.4643 | 0.2394 | 0.1671 | 0.0827 | 0.0432 | 0 |
| 33 | 0.5172 | 0.2308 | 0 | 1 | 1 | 0 | 1 | 0.4934 | 0.6515 | 0.2000 | 0.5571 | 0.2809 | 0.1685 | 0.9326 | 0.2809 | 0.7398 | 0.5029 | 0.3464 | 0.0867 | 0.8865 | 0.8374 | 0 |
| 34 | 1.0000 | 0.8462 | 1 | 1 | 1 | 0 | 0 | 0.9637 | 0.6362 | 0.6000 | 0.2445 | 0.0899 | 0.4831 | 0.7079 | 0.1011 | 0.3902 | 0.9657 | 0.2650 | 0.1241 | 0.4717 | 0.3491 | 0 |
| 35 | 0.1724 | 0.3846 | 1 | 0 | 1 | 1 | 1 | 0.3787 | 0.2155 | 0.2000 | 0.1928 | 0.8876 | 0.3146 | 0.8090 | 0.1011 | 0.7805 | 0.2270 | 0.0762 | 0.1084 | 0.8338 | 0.2993 | 1 |
| 36 | 0.3103 | 0.5385 | 1 | 0 | 0 | 0 | 1 | 0.8027 | 0.2797 | 0.8000 | 0.2083 | 0.7079 | 0.4494 | 0.2697 | 0.4607 | 0.9919 | 0.9701 | 0.0429 | 0.0984 | 0.7273 | 0.9210 | 1 |
| 37 | 0.9655 | 0.7692 | 1 | 0 | 1 | 1 | 0 | 0.4801 | 0.1720 | 0.8000 | 0.9701 | 0.9775 | 0.3146 | 0.9775 | 0.9775 | 0.3984 | 0.0936 | 0.4070 | 0.1237 | 0.4156 | 0.3545 | 0 |
| 38 | 0.7586 | 0.5385 | 0 | 0 | 1 | 0 | 0 | 0.8467 | 0.9111 | 0 | 0.5898 | 0.0449 | 0.0787 | 0.2809 | 0.6966 | 0.6992 | 0.0393 | 0.4835 | 0.2116 | 0.3252 | 0.1468 | 0 |
| 39 | 0.2759 | 0.4615 | 0 | 1 | 0 | 0 | 0 | 0.0020 | 0.4474 | 0.8000 | 0.0194 | 0.2135 | 0.2584 | 0.1236 | 0.2360 | 0.8862 | 0.8004 | 0.0592 | 0.0076 | 0.0384 | 0.0017 | 0 |
| 40 | 0 | 0.6923 | 1 | 0 | 0 | 0 | 1 | 0.7645 | 0.1024 | 0.8000 | 0.7750 | 0.4719 | 0.5730 | 0.1011 | 0.3258 | 1.0000 | 0.7119 | 0.0853 | 0.1092 | 0.2721 | 0.5004 | 1 |
| 41 | 0.8966 | 0.8462 | 1 | 1 | 1 | 1 | 1 | 0.3991 | 0.0609 | 0.8000 | 0.6735 | 0.3371 | 0.9888 | 0.6742 | 0.0899 | 0.8618 | 0.8722 | 0.2666 | 0.0936 | 0.6764 | 0.7947 | 0 |
| 42 | 0.1724 | 0.0769 | 1 | 0 | 0 | 1 | 0 | 0.8125 | 0.4095 | 0.2000 | 0.6826 | 0.1910 | 0.0112 | 0.9551 | 0.6966 | 0.6911 | 0.2259 | 0.1717 | 0.0405 | 0.3806 | 0.2154 | 0 |
| 43 | 0.8966 | 0.0769 | 0 | 0 | 1 | 0 | 1 | 0.7953 | 0.8165 | 0.8000 | 0.0637 | 0.5955 | 0.6180 | 0.8315 | 0.6404 | 0.8374 | 0.2269 | 0.8831 | 0.0619 | 0.2499 | 0.3626 | 1 |
| 44 | 0.1379 | 0.0769 | 1 | 0 | 1 | 1 | 1 | 0.1532 | 0.7649 | 0.8000 | 0.6344 | 0.1011 | 0.4157 | 0.9101 | 0.4831 | 0.7805 | 0.4948 | 0.9627 | 0.1170 | 0.9008 | 0.8457 | 1 |
| 45 | 0.5172 | 1.0000 | 1 | 1 | 0 | 1 | 1 | 0.9806 | 0.0716 | 0.8000 | 0.0358 | 0.5618 | 0.0899 | 0.1011 | 0.2697 | 0.8862 | 0.5426 | 0.3518 | 0.1104 | 0.1132 | 0.4645 | 1 |
| 46 | 0.6207 | 0.8462 | 1 | 0 | 0 | 0 | 1 | 0.7847 | 0.6273 | 0 | 0.8753 | 0.8652 | 0.7528 | 0.8090 | 0.5056 | 0.5366 | 0.0482 | 0.7221 | 0.2033 | 0.3954 | 0.1972 | 0 |
| 47 | 0.4138 | 0.6923 | 0 | 0 | 0 | 0 | 1 | 0.0526 | 0.3295 | 0.4000 | 0.4296 | 0.3258 | 0.9888 | 0.0899 | 0.8764 | 0.3902 | 0.6253 | 0.6870 | 0.0954 | 0.1249 | 0.7826 | 1 |
| 48 | 0.3793 | 0.8462 | 0 | 1 | 1 | 0 | 1 | 0.2666 | 0.3873 | 0 | 0.7042 | 0.0112 | 0.0337 | 0.0337 | 0.4382 | 0.4228 | 0.6165 | 0.9964 | 0.0484 | 0.6488 | 0.3601 | 0 |
| 49 | 0.7586 | 0.8462 | 0 | 0 | 1 | 1 | 0 | 0.4360 | 0.9617 | 0 | 0.7223 | 0.1011 | 0.8876 | 0.2247 | 0.3483 | 0.8537 | 0.7090 | 0.5821 | 0.1818 | 0.9472 | 0.6743 | 1 |
| 50 | 0.9310 | 0.3077 | 0 | 0 | 0 | 0 | 1 | 0.3576 | 0.4253 | 0.4000 | 0.3378 | 0.7865 | 0.9438 | 0.5169 | 0.6292 | 0.5447 | 0.5712 | 0.0412 | 0.0165 | 0.8715 | 0.3002 | 0 |
| 51 | 0.0345 | 0.6923 | 0 | 0 | 1 | 1 | 0 | 0.2308 | 0.7181 | 1.0000 | 0.2269 | 0.3596 | 0.2584 | 0.8764 | 0.4831 | 0.8130 | 0.8103 | 0.8311 | 0.2026 | 0.2165 | 0.2981 | 1 |
| 52 | 0.8276 | 0.0769 | 0 | 0 | 1 | 1 | 1 | 0.1982 | 0.1978 | 0.8000 | 0.6211 | 0.2135 | 0.6854 | 0.8202 | 0.4607 | 0.7805 | 0.4237 | 0.1088 | 0.0116 | 0.3718 | 0.1137 | 0 |
| 53 | 0.3103 | 0.8462 | 0 | 0 | 0 | 1 | 1 | 0.2250 | 0.2718 | 0.6000 | 0.2088 | 0.6067 | 0.7865 | 0.1011 | 0.1910 | 0.7724 | 0.0991 | 0.3048 | 0.0036 | 0.7114 | 0.4092 | 0 |
| 54 | 0.9310 | 0.7692 | 0 | 0 | 0 | 1 | 1 | 0.4292 | 0.5036 | 0.4000 | 0.8937 | 0.0225 | 0.5169 | 0.6517 | 0.0562 | 0.9756 | 0.2113 | 0.2146 | 0.1025 | 0.8321 | 0.4753 | 0 |
| 55 | 0.5172 | 0.3846 | 1 | 0 | 0 | 0 | 0 | 0.4682 | 0.1025 | 0.8000 | 0.7852 | 0.5056 | 0.9551 | 0.2247 | 0.8315 | 0.5041 | 0.4984 | 0.4777 | 0.0954 | 0.0971 | 0.7492 | 0 |
| 56 | 0.6207 | 0.9231 | 0 | 0 | 1 | 0 | 0 | 0.5739 | 0.1556 | 0.8000 | 0.9343 | 0.1685 | 0.9888 | 0.2360 | 0 | 0.3333 | 0.1246 | 0.1736 | 0.0714 | 0.0664 | 0.5940 | 0 |
| 57 | 0.2414 | 0.6154 | 1 | 0 | 0 | 0 | 0 | 0.1566 | 0.6947 | 0 | 0.6819 | 0.4045 | 0.1573 | 0.4045 | 0.6292 | 0.3171 | 0.1937 | 0.7325 | 0.0469 | 0.2242 | 0.2272 | 1 |
| 58 | 0.5517 | 0.8462 | 0 | 1 | 0 | 0 | 1 | 0.3696 | 0.2768 | 1.0000 | 0.2095 | 0.3146 | 0.0112 | 0.9326 | 0.3596 | 0.8618 | 0.2919 | 0.1252 | 0.1568 | 0.0078 | 0.6631 | 1 |
| 59 | 0 | 0.3846 | 0 | 0 | 0 | 0 | 1 | 0.6233 | 0.6654 | 0.2000 | 0.3420 | 0.0787 | 0.1910 | 0.9101 | 0.5056 | 0.4146 | 0.4187 | 0.4975 | 0.0910 | 0.1447 | 0.9798 | 0 |
| 60 | 0.0345 | 0.1538 | 0 | 0 | 1 | 0 | 1 | 0.7074 | 0.2495 | 0.6000 | 0.2797 | 0.2697 | 0.7978 | 0.1685 | 0.0225 | 0.2927 | 0.8414 | 0.9588 | 0.0250 | 0.7910 | 0.0785 | 0 |
| 61 | 0.6552 | 0.3077 | 1 | 1 | 1 | 1 | 1 | 0.8950 | 0.8549 | 0 | 0.5402 | 0.3820 | 0.2360 | 0.4831 | 0.7640 | 0.3008 | 0.1601 | 0.3896 | 0.0261 | 0.3329 | 0.2017 | 1 |
| 62 | 0.6207 | 0.0769 | 0 | 1 | 0 | 1 | 1 | 0.2271 | 0.5141 | 0.6000 | 0.3390 | 0.2697 | 0.2135 | 0.7416 | 0.8315 | 0.6829 | 0.3042 | 0.1992 | 0.2025 | 0.3590 | 0.7945 | 1 |
| 63 | 0.3448 | 0.0769 | 0 | 1 | 1 | 0 | 0 | 0.5241 | 0.1211 | 0.6000 | 0.2444 | 0.5618 | 0.0337 | 0.2360 | 0.0337 | 0.4065 | 0.5163 | 0.4533 | 0.1985 | 0.5795 | 0.2571 | 0 |
| 64 | 0.5517 | 0.8462 | 0 | 1 | 1 | 0 | 1 | 0.8659 | 0.2341 | 1.0000 | 0.5304 | 0.9775 | 0.7640 | 0.3933 | 0.7191 | 0.8862 | 0.0882 | 0.1798 | 0.0796 | 0.9347 | 0.9626 | 0 |
| 65 | 0.4483 | 0.6923 | 1 | 1 | 1 | 1 | 1 | 0.9329 | 0.0521 | 1.0000 | 0.6354 | 0.2921 | 0.3596 | 0.0562 | 0.6854 | 0.8537 | 0.3504 | 0.9234 | 0.1757 | 0.9319 | 0.6844 | 0 |
| 66 | 0.8966 | 0.4615 | 1 | 1 | 1 | 0 | 0 | 0.1198 | 0.7609 | 0.6000 | 0.9485 | 0.5281 | 0.1573 | 0.8876 | 1.0000 | 0.9268 | 0.1458 | 0.0573 | 0.1695 | 0.2569 | 0.3195 | 1 |
| 67 | 0.1379 | 0.6923 | 1 | 0 | 1 | 1 | 0 | 0.2291 | 0.6986 | 1.0000 | 0.7945 | 0.6629 | 0.0562 | 0.2247 | 0.9551 | 0.3252 | 0.9806 | 0.5943 | 0.0249 | 0.3886 | 0.1268 | 0 |
| 68 | 0.6207 | 0 | 0 | 1 | 0 | 1 | 1 | 0.1744 | 0.5596 | 0.8000 | 0.5806 | 0.7191 | 0.1461 | 0.5730 | 0.8989 | 0.4228 | 0.4914 | 0.2418 | 0.0688 | 0.1171 | 0.6714 | 1 |
| 91 | 0.8966 | 0.9231 | 0 | 1 | 1 | 1 | 0 | 0.3334 | 0.1418 | 0.4000 | 0.8980 | 0.6404 | 0.9438 | 0 | 0.3483 | 0.4715 | 0.6358 | 0.4929 | 0.0088 | 0.4450 | 0.2935 | 0 |
|---|
| 70 | 0.1724 | 0.3846 | 1 | 0 | 0 | 1 | 1 | 0.2754 | 0.1342 | 0.8000 | 0.8278 | 0.0562 | 0.2022 | 0.8202 | 0.7865 | 0.4146 | 0.7799 | 0.6137 | 0.0609 | 0.5073 | 0.7858 | 1 |
|---|
| 71 | 0.2414 | 1.0000 | 1 | 1 | 0 | 1 | 0 | 0.1059 | 0.4158 | 1.0000 | 0.1443 | 0.9888 | 0.6629 | 0.8652 | 0.5843 | 0.5772 | 0.5269 | 0.3746 | 0.0987 | 0.4659 | 0.8688 | 0 |
|---|
| 72 | 0.7586 | 0.3846 | 0 | 0 | 1 | 0 | 1 | 0.7180 | 0.6420 | 1.0000 | 0.5212 | 0.4382 | 0.8652 | 0.8427 | 0.0112 | 0.7317 | 0.5586 | 0.9642 | 0.1893 | 0.3216 | 0.2351 | 0 |
|---|
| 73 | 0.3793 | 0.0769 | 1 | 0 | 1 | 0 | 0 | 0.0619 | 0.6537 | 0.6000 | 0.5012 | 0.2472 | 0.3034 | 0.4607 | 0.4944 | 0.8211 | 0.2728 | 0.7722 | 0.1487 | 0.4640 | 0.1944 | 1 |
|---|
| 74 | 1.0000 | 0.2308 | 0 | 1 | 1 | 1 | 1 | 0.1454 | 0.5514 | 0.4000 | 0.4385 | 0.8315 | 0.4944 | 0.6180 | 0.0674 | 1.0000 | 0.8667 | 0.0830 | 0.2111 | 0.5116 | 0.3912 | 1 |
|---|
| 75 | 0.4138 | 0.4615 | 1 | 0 | 1 | 1 | 1 | 0.4450 | 0.6702 | 0.8000 | 0.0618 | 0.9888 | 0.2022 | 0.9551 | 0.5393 | 0.5691 | 0.2213 | 0.0007 | 0.1394 | 0.5987 | 0.5632 | 1 |
|---|
| 76 | 0.9310 | 0.2308 | 1 | 0 | 0 | 0 | 1 | 0.5432 | 0.4275 | 1.0000 | 0.4390 | 0.6966 | 0.5843 | 0.8315 | 0.6292 | 0.3171 | 0.7676 | 0.9544 | 0.0056 | 0.1685 | 0.2125 | 0 |
|---|
| 77 | 0 | 0.1538 | 0 | 0 | 1 | 1 | 1 | 0.9883 | 0.3147 | 1.0000 | 0.5987 | 0.2584 | 0.7303 | 0.2135 | 0.6180 | 0.6829 | 0.1404 | 0.1142 | 0.1475 | 0.3731 | 0.2895 | 1 |
|---|
| 78 | 0 | 0.3077 | 1 | 1 | 1 | 0 | 1 | 0.6453 | 0.8337 | 0.8000 | 0.3198 | 0.2697 | 0.5506 | 0.3820 | 0.9101 | 0.7886 | 0.9279 | 0.4313 | 0.1639 | 0.5642 | 0.4970 | 0 |
|---|
| 79 | 0.0690 | 0.6923 | 1 | 0 | 1 | 1 | 0 | 0.1382 | 0.5442 | 0.2000 | 0.9565 | 0.1798 | 0.9888 | 0.1798 | 0.0787 | 0.9431 | 0.1420 | 0.9058 | 0.0195 | 0.9834 | 0.5352 | 1 |
| 80 | 0.2069 | 0.9231 | 1 | 0 | 1 | 0 | 1 | 0.6269 | 0.4080 | 0.8000 | 0.9755 | 0.6966 | 0.2247 | 0.7753 | 0.3146 | 0.5122 | 0.0456 | 0.3252 | 0.0528 | 0.4804 | 0.4098 | 1 |
| 81 | 1.0000 | 0.3077 | 1 | 1 | 0 | 1 | 0 | 0.2765 | 0.7185 | 1.0000 | 0.1713 | 0.2697 | 0.6854 | 0.1348 | 0.6404 | 0.7805 | 0.4751 | 0.0055 | 0.1526 | 0.2926 | 0.7740 | 0 |
| 82 | 0.0345 | 0.9231 | 1 | 0 | 1 | 0 | 1 | 0.8412 | 0.4145 | 0.2000 | 0.1519 | 0.6966 | 0.5169 | 0.4045 | 0.2809 | 0.9756 | 0.5014 | 0.0378 | 0.0457 | 0.2436 | 0.7992 | 0 |
| 83 | 0.8276 | 0.6154 | 0 | 0 | 0 | 0 | 1 | 0.6876 | 0.7791 | 1.0000 | 0.2973 | 0.3371 | 0.6404 | 0.7865 | 0.8427 | 0.3659 | 0.3518 | 0.8533 | 0.1437 | 0.5568 | 0.6018 | 0 |
| 84 | 0.8276 | 0.3077 | 1 | 1 | 1 | 1 | 1 | 0.3131 | 0.8621 | 0.4000 | 0.0791 | 0.7416 | 0.9326 | 0.6180 | 0.9775 | 0.5366 | 0.9734 | 0.1348 | 0.0759 | 0.2940 | 0.7635 | 1 |
| 85 | 0.0690 | 0.9231 | 0 | 1 | 0 | 1 | 1 | 0.9462 | 0.3339 | 0 | 0.2402 | 0.6180 | 0.2697 | 0.6067 | 0.3483 | 0.3089 | 0.9534 | 0.6850 | 0.0122 | 0.5530 | 0.9310 | 1 |
| 86 | 0.2414 | 0.6923 | 1 | 0 | 0 | 0 | 1 | 0.3552 | 0.5715 | 0 | 0.1513 | 1.0000 | 0.4719 | 0.7528 | 0.9775 | 0.5935 | 0.5779 | 0.7232 | 0.0139 | 0.3238 | 0.2328 | 0 |
| 87 | 0.6897 | 0.3846 | 1 | 1 | 1 | 1 | 0 | 0.4179 | 0.0630 | 0.4000 | 0.4377 | 0.3146 | 0.2247 | 0.6067 | 0.8539 | 0.6992 | 0.9725 | 0.5790 | 0.1098 | 0.7833 | 0.1185 | 1 |
| 88 | 0.2414 | 0.4615 | 1 | 0 | 0 | 0 | 1 | 0.3986 | 0.1145 | 0.2000 | 0.0996 | 0.2472 | 0.6067 | 0.6517 | 0.5281 | 0.2764 | 0.6712 | 0.2688 | 0.1566 | 0.5732 | 0.8276 | 1 |
| 89 | 0.1724 | 0.8462 | 1 | 1 | 0 | 0 | 1 | 0.8066 | 0.9401 | 0.8000 | 0.5140 | 0.7753 | 0.7303 | 0.7978 | 0.2472 | 0.5285 | 0.5687 | 0.1972 | 0.1902 | 0.6841 | 0.6951 | 0 |
| 90 | 0.8621 | 0.0769 | 0 | 0 | 1 | 0 | 0 | 0.3345 | 0.2010 | 0.6000 | 0.2604 | 0.0674 | 0.9326 | 0.1461 | 0.3596 | 0.4715 | 0.3201 | 0.8424 | 0.0807 | 0.4268 | 0.7085 | 0 |
| 91 | 0.8966 | 0.9231 | 0 | 1 | 1 | 1 | 0 | 0.3334 | 0.1418 | 0.4000 | 0.8980 | 0.6404 | 0.9438 | 0 | 0.3483 | 0.4715 | 0.6358 | 0.4929 | 0.0088 | 0.4450 | 0.2935 | 0 |
| 92 | 0.4483 | 0.2308 | 1 | 0 | 1 | 1 | 1 | 0.8125 | 0.0376 | 0.6000 | 0.4727 | 0.3483 | 0.0337 | 0.2697 | 0.3708 | 0.6016 | 0.4064 | 0.4819 | 0.0999 | 0.7092 | 0.5395 | 0 |
| 93 | 0.9655 | 0.2308 | 0 | 1 | 1 | 1 | 0 | 0.8912 | 0.2587 | 0 | 0.0501 | 0.7753 | 0.9551 | 0.8989 | 0.8652 | 0.6667 | 0.5269 | 0.2847 | 0 | 0 | 0 | 1 |
| 94 | 0.3793 | 0.3077 | 1 | 1 | 1 | 1 | 1 | 0.0821 | 0.1595 | 0.2000 | 0.2869 | 0.1348 | 0.3933 | 0.8539 | 0.5506 | 0.7154 | 0.7082 | 0.8415 | 0.0304 | 0.9187 | 0.4674 | 1 |
| 95 | 0.8966 | 0.3077 | 0 | 0 | 1 | 0 | 1 | 0.4681 | 0.2667 | 0.4000 | 0.3736 | 0.7640 | 0.2809 | 0.8764 | 0.6067 | 0.4390 | 0.2268 | 0.9873 | 0.1769 | 0.6873 | 0.6153 | 1 |
| 96 | 0.1724 | 0.4615 | 1 | 1 | 0 | 1 | 0 | 0.8026 | 0.8435 | 0.6000 | 0.7829 | 0.4719 | 0.2584 | 0.6067 | 0.2472 | 0.3089 | 0.3827 | 0.9693 | 0 | 0 | 0 | 1 |
| 97 | 0.2759 | 0.6923 | 1 | 1 | 0 | 0 | 0 | 0.4434 | 0.4206 | 0 | 0.9468 | 0.3483 | 0.2472 | 0.2809 | 0.8876 | 0.4634 | 0.7271 | 0.7847 | 0.2139 | 0.9619 | 0.4873 | 0 |
| 98 | 0.4138 | 0.6923 | 1 | 0 | 0 | 0 | 1 | 0.3623 | 0.1069 | 0.2000 | 0.7452 | 0.0899 | 0.3371 | 0.0787 | 0.4270 | 0.3008 | 0.2770 | 0.5346 | 0.0742 | 0.1533 | 0.1801 | 0 |
| 99 | 0.1379 | 0.9231 | 0 | 1 | 0 | 1 | 0 | 0.9547 | 0.6710 | 0.6000 | 0.0787 | 0.4157 | 0.7640 | 0.8202 | 0.9438 | 0.9350 | 0.6219 | 0.3204 | 0 | 0 | 0 | 0 |
| 100 | 0.9655 | 0.4615 | 1 | 1 | 0 | 0 | 1 | 0.6419 | 0.4627 | 0.4000 | 0.9043 | 1.0000 | 0.3371 | 0.2809 | 0.1236 | 0.4797 | 0.0673 | 0.1471 | 0.0554 | 0.1998 | 0.1518 | 0 |
| â‹® |
Thus, we evaluate the performance of models trained on the distinct feature sets selected by methods 1, 2, 3, and 5.
SUPERVISED LEARNING
TRAINING SUPERVISED LEARNING MODEL (Categorical Variables can be specified)
This is a classification problem, so we use logistic regression rather than linear regression.
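Thresholding the fitted probabilities at 0.5, as done after each fitglm prediction below, can be sketched in Python (illustrative only; the analysis itself is MATLAB):

```python
# Convert predicted probabilities to hard 0/1 labels at a 0.5 cutoff,
# mirroring pred(pred<0.5)=0; pred(pred>=0.5)=1 in the script.
def threshold(probs, cutoff=0.5):
    return [1 if p >= cutoff else 0 for p in probs]

print(threshold([0.2, 0.5, 0.73, 0.49]))  # -> [0, 1, 1, 0]
```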
Using Method 1 of feature selection
X1=[HCV_new.BMI, HCV_new.NauseaVomting, HCV_new.RNAEF];
[Xtrain1, Ytrain1, Xtest1, Ytest1]=trainTestSplit(X1,Y,0.7);
Why try additional classification models?
There is a point in implementing new methods only if they improve the classification results.
Logmodel=fitglm(Xtrain1,categorical(Ytrain1),'link','logit','Distribution',"binomial",'CategoricalVars',{'x2'});
pred1=predict(Logmodel,Xtest1);pred1(pred1<0.5)=0;pred1(pred1>=0.5)=1;
[accuracy1,precision1,recall1]=meas(categorical(pred1),Ytest1);
toc
Elapsed time is 0.076980 seconds.
%Random Forest (Classification)
RFmodel=TreeBagger(100,Xtrain1,categorical(Ytrain1),'MinLeafSize',10,'OOBPrediction',"on",'OOBPredictorImportance',"on",'CategoricalPredictors',[2]);
pred2=categorical(predict(RFmodel,Xtest1));
[accuracy2,precision2,recall2]=meas(pred2,Ytest1);
toc
Elapsed time is 2.735312 seconds.
%Decision Tree (Classification)
DecisionTree=fitctree(Xtrain1,categorical(Ytrain1),"CategoricalPredictors",[2]);
pred3=predict(DecisionTree,Xtest1);
[accuracy3,precision3,recall3]=meas(pred3,Ytest1);
toc
Elapsed time is 0.062668 seconds.
%Naive Bayes (Classification)
NBmodel=fitcnb(Xtrain1,categorical(Ytrain1),'PredictorNames',{'BMI','NauseaVomiting','RNAEF'},"CategoricalPredictors",'NauseaVomiting');
pred4=predict(NBmodel,Xtest1);
[accuracy4,precision4,recall4]=meas(pred4,Ytest1);
toc
Elapsed time is 0.044085 seconds.
%Support vector machine (SVM) classifier for one-class and binary classification
CSVM=fitcsvm(Xtrain1,categorical(Ytrain1),"CategoricalPredictors",[2]);
pred5=predict(CSVM,Xtest1);
[accuracy5,precision5,recall5]=meas(pred5,Ytest1);
toc
Elapsed time is 0.108909 seconds.
We can conclude that Naive Bayes achieves metrics similar to those of logistic regression while taking the least execution time among the classifiers above, so I decided to carry the Naive Bayes classifier forward as well.
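The accuracy, precision, and recall reported by the meas helper (defined at the end of the script) reduce to simple counts; a Python sketch of the same computation:

```python
# Accuracy, precision, and recall for binary labels, positive class = 1.
def metrics(pred, truth):
    tp = sum(p == 1 and t == 1 for p, t in zip(pred, truth))
    acc = sum(p == t for p, t in zip(pred, truth)) / len(truth)
    prec = tp / max(sum(p == 1 for p in pred), 1)
    rec = tp / max(sum(t == 1 for t in truth), 1)
    return acc, prec, rec

print(metrics([1, 0, 1, 1], [1, 0, 0, 1]))  # acc=0.75, prec=2/3, rec=1.0
```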
5-fold cross validation
fold=5;
index1=crossvalind('kfold',size(Xtrain1,1),fold);
accuracy1=zeros(1,fold); precision1=zeros(1,fold); recall1=zeros(1,fold);
accuracy2=zeros(1,fold); precision2=zeros(1,fold); recall2=zeros(1,fold);
accuracy3=zeros(1,fold); precision3=zeros(1,fold); recall3=zeros(1,fold);
accuracy4=zeros(1,fold); precision4=zeros(1,fold); recall4=zeros(1,fold);
accuracy5=zeros(1,fold); precision5=zeros(1,fold); recall5=zeros(1,fold);
Logmodel=fitglm(Xtrain1,categorical(Ytrain1),'link','logit','Distribution',"binomial",'CategoricalVars',{'x2'});
pred1=predict(Logmodel,Xtest1);pred1(pred1<0.5)=0;pred1(pred1>=0.5)=1;
[accuracy1(1,i),precision1(1,i),recall1(1,i)]=meas(categorical(pred1),Ytest1);
%Random Forest (Classification)
RFmodel=TreeBagger(100,Xtrain1,categorical(Ytrain1),'MinLeafSize',10,'OOBPrediction',"on",'OOBPredictorImportance',"on",'CategoricalPredictors',[2]);
pred2=categorical(predict(RFmodel,Xtest1)); %pred2=double(categorical(pred2));
[accuracy2(1,i),precision2(1,i),recall2(1,i)]=meas(pred2,Ytest1);
%Decision Tree (Classification)
DecisionTree=fitctree(Xtrain1,categorical(Ytrain1),'PredictorNames',{'BMI','NauseaVomiting','RNAEF'},"CategoricalPredictors",'NauseaVomiting');
pred3=predict(DecisionTree,Xtest1);
[accuracy3(1,i),precision3(1,i),recall3(1,i)]=meas(pred3,Ytest1);
err(1,i)=cvloss(DecisionTree);
NBmodel=fitcnb(Xtrain1,categorical(Ytrain1),"CategoricalPredictors",[2]);
pred4=predict(NBmodel,Xtest1);
[accuracy4(1,i),precision4(1,i),recall4(1,i)]=meas(pred4,Ytest1);
%Multinomial Logistic Regression - categorical variables cannot be specified
[b5,dev5,stats5]=mnrfit(Xtrain1,categorical(Ytrain1));
[~,pred5]=max(mnrval(b5,Xtest1)');pred5=(pred5-1)';
[accuracy5(1,i),precision5(1,i),recall5(1,i)]=meas(categorical(pred5),Ytest1);
method1=[mean(accuracy1),mean(precision1),mean(recall1);...
mean(accuracy2),mean(precision2),mean(recall2);...
mean(accuracy3),mean(precision3),mean(recall3);...
mean(accuracy4),mean(precision4),mean(recall4);...
mean(accuracy5),mean(precision5),mean(recall5)]
0.5211 0.4832 0.3336
0.4943 0.5272 0.4205
0.4665 0.5521 0.4989
0.5273 0.4744 0.3404
0.5211 0.4832 0.3336
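crossvalind('kfold', n, fold) assigns each training row a fold label from 1 to fold; a minimal Python analogue (illustrative, not the MATLAB function):

```python
import random

# Assign each of n samples a fold label in 1..k, with near-equal fold sizes.
def kfold_indices(n, k, seed=0):
    idx = [i % k + 1 for i in range(n)]  # balanced fold labels
    random.Random(seed).shuffle(idx)     # random assignment to samples
    return idx

folds = kfold_indices(10, 5)  # each fold label appears twice
```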
Using Method 2 of feature selection
[Xtrain2, Ytrain2, Xtest2, Ytest2]=trainTestSplit(X2, Y,0.7);
5-fold cross validation
fold=5;
index1=crossvalind('kfold',size(Xtrain2,1),fold);
accuracy1=zeros(1,fold); precision1=zeros(1,fold); recall1=zeros(1,fold);
accuracy2=zeros(1,fold); precision2=zeros(1,fold); recall2=zeros(1,fold);
accuracy3=zeros(1,fold); precision3=zeros(1,fold); recall3=zeros(1,fold);
accuracy4=zeros(1,fold); precision4=zeros(1,fold); recall4=zeros(1,fold);
accuracy5=zeros(1,fold); precision5=zeros(1,fold); recall5=zeros(1,fold);
Logmodel=fitglm(Xtrain2,categorical(Ytrain2),'link','logit','Distribution',"binomial",'CategoricalVars',{'x2'});
pred1=predict(Logmodel,Xtest2);pred1(pred1<0.5)=0;pred1(pred1>=0.5)=1;
[accuracy1(1,i),precision1(1,i),recall1(1,i)]=meas(categorical(pred1),Ytest2);
%Random Forest (Classification)
RFmodel=TreeBagger(100,Xtrain2,categorical(Ytrain2),'MinLeafSize',10,'OOBPrediction',"on",'OOBPredictorImportance',"on",'CategoricalPredictors',[2]);
pred2=categorical(predict(RFmodel,Xtest2));
[accuracy2(1,i),precision2(1,i),recall2(1,i)]=meas(pred2,Ytest2);
%Decision Tree (Classification)
DecisionTree=fitctree(Xtrain2,categorical(Ytrain2),'PredictorNames',{'BMI','NauseaVomiting'},"CategoricalPredictors",'NauseaVomiting');
pred3=predict(DecisionTree,Xtest2);
[accuracy3(1,i),precision3(1,i),recall3(1,i)]=meas(pred3,Ytest2);
err(1,i)=cvloss(DecisionTree);
NBmodel=fitcnb(Xtrain2,categorical(Ytrain2),"CategoricalPredictors",[2]);
pred4=predict(NBmodel,Xtest2);
[accuracy4(1,i),precision4(1,i),recall4(1,i)]=meas(pred4,Ytest2);
%Multinomial Logistic Regression - categorical variables cannot be specified
[b5,dev5,stats5]=mnrfit(Xtrain2,categorical(Ytrain2));
[~,pred5]=max(mnrval(b5,Xtest2)');pred5=(pred5-1)';
[accuracy5(1,i),precision5(1,i),recall5(1,i)]=meas(categorical(pred5),Ytest2);
method2=[mean(accuracy1),mean(precision1),mean(recall1);...
mean(accuracy2),mean(precision2),mean(recall2);...
mean(accuracy3),mean(precision3),mean(recall3);...
mean(accuracy4),mean(precision4),mean(recall4);...
mean(accuracy5),mean(precision5),mean(recall5)]
0.5108 0.4970 0.3712
0.4923 0.5249 0.4614
0.5026 0.5123 0.4430
0.5119 0.4928 0.3566
0.5108 0.4970 0.3712
Using Method 3 of feature selection
X3=table2array(HCV_new(:,1:end-1));
colmin = min(X3); colmax = max(X3);
X3 = rescale(X3, 'InputMin', colmin, 'InputMax', colmax);
[Xtrain3, Ytrain3, Xtest3, Ytest3]=trainTestSplit(X3, Y,0.7);
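The per-column min-max normalization performed by rescale with 'InputMin'/'InputMax' can be sketched as (Python, illustrative):

```python
# Scale each column of a row-major matrix to [0, 1] using its own min/max,
# as rescale(X, 'InputMin', min(X), 'InputMax', max(X)) does column-wise.
def minmax_scale(rows):
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [[(v - l) / (h - l) if h > l else 0.0
             for v, l, h in zip(row, lo, hi)] for row in rows]

print(minmax_scale([[1.0, 10.0], [3.0, 20.0], [5.0, 30.0]]))
# -> [[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]]
```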
5-fold cross validation
fold=5;
index1=crossvalind('kfold',size(Xtrain3,1),fold);
accuracy1=zeros(1,fold); precision1=zeros(1,fold); recall1=zeros(1,fold);
accuracy2=zeros(1,fold); precision2=zeros(1,fold); recall2=zeros(1,fold);
accuracy3=zeros(1,fold); precision3=zeros(1,fold); recall3=zeros(1,fold);
accuracy4=zeros(1,fold); precision4=zeros(1,fold); recall4=zeros(1,fold);
accuracy5=zeros(1,fold); precision5=zeros(1,fold); recall5=zeros(1,fold);
Logmodel=fitglm(Xtrain3,categorical(Ytrain3),'link','logit','Distribution',"binomial",'CategoricalVars',{'x2','x4','x5','x6','x7','x8'});
pred1=predict(Logmodel,Xtest3);pred1(pred1<0.5)=0;pred1(pred1>=0.5)=1;
[accuracy1(1,i),precision1(1,i),recall1(1,i)]=meas(categorical(pred1),Ytest3);
%Random Forest (Classification)
RFmodel=TreeBagger(100,Xtrain3,categorical(Ytrain3),'MinLeafSize',10,'OOBPrediction',"on",'OOBPredictorImportance',"on",'CategoricalPredictors',[2,4,5,6,7,8]);
pred2=categorical(predict(RFmodel,Xtest3));
[accuracy2(1,i),precision2(1,i),recall2(1,i)]=meas(pred2,Ytest3);
%Decision Tree (Classification)
DecisionTree=fitctree(Xtrain3,categorical(Ytrain3),"CategoricalPredictors",[2,4,5,6,7,8]);
pred3=predict(DecisionTree,Xtest3);
[accuracy3(1,i),precision3(1,i),recall3(1,i)]=meas(pred3,Ytest3);
err(1,i)=cvloss(DecisionTree);
NBmodel=fitcnb(Xtrain3,categorical(Ytrain3),"CategoricalPredictors",[2,4,5,6,7,8]);
pred4=predict(NBmodel,Xtest3);
[accuracy4(1,i),precision4(1,i),recall4(1,i)]=meas(pred4,Ytest3);
%Multinomial Logistic Regression - categorical variables cannot be specified
[b5,dev5,stats5]=mnrfit(Xtrain3,categorical(Ytrain3));
[~,pred5]=max(mnrval(b5,Xtest3)');pred5=(pred5-1)';
[accuracy5(1,i),precision5(1,i),recall5(1,i)]=meas(categorical(pred5),Ytest3);
method3=[mean(accuracy1),mean(precision1),mean(recall1);...
mean(accuracy2),mean(precision2),mean(recall2);...
mean(accuracy3),mean(precision3),mean(recall3);...
mean(accuracy4),mean(precision4),mean(recall4);...
mean(accuracy5),mean(precision5),mean(recall5)]
0.4882 0.5309 0.4574
0.4881 0.5362 0.4281
0.5211 0.4942 0.4394
0.4778 0.5432 0.4385
0.4882 0.5309 0.4574
m1=array2table(method1,...
"RowNames",{'FSMethod1_glm','FSMethod1_TB','FSMethod1_DT','FSMethod1_NB','FSMethod1_MLR'},...
'VariableNames',{'Accuracy','Precision','Recall'})
m1 = 5×3 table
| | Accuracy | Precision | Recall |
|---|---|---|---|
| 1 FSMethod1_glm | 0.5211 | 0.4832 | 0.3336 |
| 2 FSMethod1_TB | 0.4943 | 0.5272 | 0.4205 |
| 3 FSMethod1_DT | 0.4665 | 0.5521 | 0.4989 |
| 4 FSMethod1_NB | 0.5273 | 0.4744 | 0.3404 |
| 5 FSMethod1_MLR | 0.5211 | 0.4832 | 0.3336 |
m2=array2table(method2,...
"RowNames",{'FSMethod2_glm','FSMethod2_TB','FSMethod2_DT','FSMethod2_NB','FSMethod2_MLR'},...
'VariableNames',{'Accuracy','Precision','Recall'})
m2 = 5×3 table
| | Accuracy | Precision | Recall |
|---|---|---|---|
| 1 FSMethod2_glm | 0.5108 | 0.4970 | 0.3712 |
| 2 FSMethod2_TB | 0.4923 | 0.5249 | 0.4614 |
| 3 FSMethod2_DT | 0.5026 | 0.5123 | 0.4430 |
| 4 FSMethod2_NB | 0.5119 | 0.4928 | 0.3566 |
| 5 FSMethod2_MLR | 0.5108 | 0.4970 | 0.3712 |
m3=array2table(method3,...
"RowNames",{'FSMethod3_glm','FSMethod3_TB','FSMethod3_DT','FSMethod3_NB','FSMethod3_MLR'},...
'VariableNames',{'Accuracy','Precision','Recall'})
m3 = 5×3 table
| | Accuracy | Precision | Recall |
|---|---|---|---|
| 1 FSMethod3_glm | 0.4882 | 0.5309 | 0.4574 |
| 2 FSMethod3_TB | 0.4881 | 0.5362 | 0.4281 |
| 3 FSMethod3_DT | 0.5211 | 0.4942 | 0.4394 |
| 4 FSMethod3_NB | 0.4778 | 0.5432 | 0.4385 |
| 5 FSMethod3_MLR | 0.4882 | 0.5309 | 0.4574 |
Using Method 5.1 of feature selection
X4=X4(:,1:end-1)
0 1.0000 0.4867 0.3597 0.1433 0.5056 0.4719 0 0.5280 0.0772 0
1.0000 0 1.0000 0.5100 0.2724 0.9438 0.8315 0.9593 0.4482 0.1707 0.4166
1.0000 0 0.1303 0.6695 0.4384 0.1124 0.8652 0 0.5503 0 0.9103
1.0000 0 0.3841 0.8138 0.4005 0.2809 0.5506 0.5854 0.3744 0.1570 0.9209
0 1.0000 0.0735 0.6572 0.7094 0.7303 0.9101 0.6911 0.6147 1.0000 0.4193
1.0000 0 0.9653 0.0549 0.2864 0.7303 0.2921 0.8862 0.9044 0 0
0 1.0000 0.9472 0.7744 0.6313 0.2022 0.7640 0.6098 0.8604 0.0737 0.2654
0 1.0000 0.4768 0.4904 0.9229 0.8202 0.0674 0.3902 0.0600 0.2110 0.4584
0 1.0000 0.8221 0.6589 0.4187 0.4944 0.6517 0.2764 0.6302 0 0.4590
0 1.0000 0.4050 0.5315 0.0389 0.3258 0.4719 0.3089 0.1918 0.0716 0.3405
[Xtrain51, Ytrain51, Xtest51, Ytest51]=trainTestSplit(X4, Y,0.7);
5-fold cross validation
fold=5;
index1=crossvalind('kfold',size(Xtrain51,1),fold);
accuracy1=zeros(1,fold); precision1=zeros(1,fold); recall1=zeros(1,fold);
accuracy2=zeros(1,fold); precision2=zeros(1,fold); recall2=zeros(1,fold);
accuracy3=zeros(1,fold); precision3=zeros(1,fold); recall3=zeros(1,fold);
accuracy4=zeros(1,fold); precision4=zeros(1,fold); recall4=zeros(1,fold);
accuracy5=zeros(1,fold); precision5=zeros(1,fold); recall5=zeros(1,fold);
Logmodel=fitglm(Xtrain51,categorical(Ytrain51),'link','logit','Distribution',"binomial",'CategoricalVars',[1,2]);
pred1=predict(Logmodel,Xtest51);pred1(pred1<0.5)=0;pred1(pred1>=0.5)=1;
[accuracy1(1,i),precision1(1,i),recall1(1,i)]=meas(categorical(pred1),Ytest51);
%Random Forest (Classification)
RFmodel=TreeBagger(100,Xtrain51,categorical(Ytrain51),'MinLeafSize',10,'OOBPrediction',"on",'OOBPredictorImportance',"on",'CategoricalPredictors',[1,2]);
pred2=categorical(predict(RFmodel,Xtest51));
[accuracy2(1,i),precision2(1,i),recall2(1,i)]=meas(pred2,Ytest51);
%Decision Tree (Classification)
DecisionTree=fitctree(Xtrain51,categorical(Ytrain51),"CategoricalPredictors",[1,2]);
pred3=predict(DecisionTree,Xtest51);
[accuracy3(1,i),precision3(1,i),recall3(1,i)]=meas(pred3,Ytest51);
err(1,i)=cvloss(DecisionTree);
NBmodel=fitcnb(Xtrain51,categorical(Ytrain51),"CategoricalPredictors",[1,2]);
pred4=predict(NBmodel,Xtest51);
[accuracy4(1,i),precision4(1,i),recall4(1,i)]=meas(pred4,Ytest51);
%Multinomial Logistic Regression - categorical variables cannot be specified
[b5,dev5,stats5]=mnrfit(Xtrain51,categorical(Ytrain51));
[~,pred5]=max(mnrval(b5,Xtest51)');pred5=(pred5-1)';
[accuracy5(1,i),precision5(1,i),recall5(1,i)]=meas(categorical(pred5),Ytest51);
method51=[mean(accuracy1),mean(precision1),mean(recall1);...
mean(accuracy2),mean(precision2),mean(recall2);...
mean(accuracy3),mean(precision3),mean(recall3);...
mean(accuracy4),mean(precision4),mean(recall4);...
mean(accuracy5),mean(precision5),mean(recall5)]
0.4954 0.5284 0.3922
0.4933 0.5296 0.4137
0.4902 0.5299 0.4781
0.5036 0.5181 0.3500
0.4954 0.5284 0.3922
m4=array2table(method51,...
"RowNames",{'FSMethod51_glm','FSMethod51_TB','FSMethod51_DT','FSMethod51_NB','FSMethod51_MLR'},...
'VariableNames',{'Accuracy','Precision','Recall'})
m4 = 5×3 table
| | Accuracy | Precision | Recall |
|---|---|---|---|
| 1 FSMethod51_glm | 0.4954 | 0.5284 | 0.3922 |
| 2 FSMethod51_TB | 0.4933 | 0.5296 | 0.4137 |
| 3 FSMethod51_DT | 0.4902 | 0.5299 | 0.4781 |
| 4 FSMethod51_NB | 0.5036 | 0.5181 | 0.3500 |
| 5 FSMethod51_MLR | 0.4954 | 0.5284 | 0.3922 |
Using Method 5.2 of feature selection
X5=X5(:,1:end-1)
0.8276 1.0000 0 0 0 1.0000 1.0000 0.4867 0.3597 0.8000 0.1433 0.5056 0.1461 0.7865 0.4719 0 0.5456 0.5280 0.0772 0
0.4828 0.5385 1.0000 1.0000 0 1.0000 0 1.0000 0.5100 0 0.2724 0.9438 0.6292 0.4045 0.8315 0.9593 0.0338 0.4482 0.1707 0.4166
0.8621 0.8462 1.0000 1.0000 1.0000 0 0 0.1303 0.6695 0.4000 0.4384 0.1124 0.6292 0.7640 0.8652 0 0.4755 0.5503 0 0.9103
0.5862 0.8462 1.0000 0 1.0000 1.0000 0 0.3841 0.8138 0 0.4005 0.2809 0.7865 0.4607 0.5506 0.5854 0.8675 0.3744 0.1570 0.9209
0.9310 0.7692 0 1.0000 0 1.0000 1.0000 0.0735 0.6572 0.2000 0.7094 0.7303 0.3146 0.1011 0.9101 0.6911 0.5498 0.6147 1.0000 0.4193
0.8966 0 1.0000 1.0000 0 1.0000 0 0.9653 0.0549 1.0000 0.2864 0.7303 0.9213 0.6404 0.2921 0.8862 0.9637 0.9044 0 0
0.3448 0.3077 0 1.0000 1.0000 1.0000 1.0000 0.9472 0.7744 0.4000 0.6313 0.2022 0.8315 0.8876 0.7640 0.6098 0.2712 0.8604 0.0737 0.2654
0.5517 0.6154 0 1.0000 1.0000 0 1.0000 0.4768 0.4904 0.2000 0.9229 0.8202 0.4607 0.9888 0.0674 0.3902 0.5338 0.0600 0.2110 0.4584
0.4138 0.0769 0 1.0000 1.0000 0 1.0000 0.8221 0.6589 0.4000 0.4187 0.4944 0.1798 0.7079 0.6517 0.2764 0.4924 0.6302 0 0.4590
0.4483 0.6154 0 1.0000 1.0000 0 1.0000 0.4050 0.5315 0.4000 0.0389 0.3258 0.3708 0.9888 0.4719 0.3089 0.9585 0.1918 0.0716 0.3405
[Xtrain52, Ytrain52, Xtest52, Ytest52]=trainTestSplit(X5, Y,0.7);
5-fold cross validation
fold=5;
index1=crossvalind('kfold',size(Xtrain52,1),fold);
accuracy1=zeros(1,fold); precision1=zeros(1,fold); recall1=zeros(1,fold);
accuracy2=zeros(1,fold); precision2=zeros(1,fold); recall2=zeros(1,fold);
accuracy3=zeros(1,fold); precision3=zeros(1,fold); recall3=zeros(1,fold);
accuracy4=zeros(1,fold); precision4=zeros(1,fold); recall4=zeros(1,fold);
accuracy5=zeros(1,fold); precision5=zeros(1,fold); recall5=zeros(1,fold);
Logmodel=fitglm(Xtrain52,categorical(Ytrain52),'link','logit','Distribution',"binomial",'CategoricalVars',[3:7]);
pred1=predict(Logmodel,Xtest52);pred1(pred1<0.5)=0;pred1(pred1>=0.5)=1;
[accuracy1(1,i),precision1(1,i),recall1(1,i)]=meas(categorical(pred1),Ytest52);
%Random Forest (Classification)
RFmodel=TreeBagger(100,Xtrain52,categorical(Ytrain52),'MinLeafSize',10,'OOBPrediction',"on",'OOBPredictorImportance',"on",'CategoricalPredictors',[3:7]);
pred2=categorical(predict(RFmodel,Xtest52));
[accuracy2(1,i),precision2(1,i),recall2(1,i)]=meas(pred2,Ytest52);
%Decision Tree (Classification)
DecisionTree=fitctree(Xtrain52,categorical(Ytrain52),"CategoricalPredictors",[3:7]);
pred3=predict(DecisionTree,Xtest52);
[accuracy3(1,i),precision3(1,i),recall3(1,i)]=meas(pred3,Ytest52);
err(1,i)=cvloss(DecisionTree);
NBmodel=fitcnb(Xtrain52,categorical(Ytrain52),"CategoricalPredictors",[3:7]);
pred4=predict(NBmodel,Xtest52);
[accuracy4(1,i),precision4(1,i),recall4(1,i)]=meas(pred4,Ytest52);
%Multinomial Logistic Regression - categorical variables cannot be specified
[b5,dev5,stats5]=mnrfit(Xtrain52,categorical(Ytrain52));
[~,pred5]=max(mnrval(b5,Xtest52)');pred5=(pred5-1)';
[accuracy5(1,i),precision5(1,i),recall5(1,i)]=meas(categorical(pred5),Ytest52);
method52=[mean(accuracy1),mean(precision1),mean(recall1);...
mean(accuracy2),mean(precision2),mean(recall2);...
mean(accuracy3),mean(precision3),mean(recall3);...
mean(accuracy4),mean(precision4),mean(recall4);...
mean(accuracy5),mean(precision5),mean(recall5)]
0.5015 0.5130 0.4205
0.5067 0.5095 0.4160
0.4994 0.5193 0.4915
0.4861 0.5344 0.4192
0.5015 0.5130 0.4205
m5=array2table(method52,...
"RowNames",{'FSMethod52_glm','FSMethod52_TB','FSMethod52_DT','FSMethod52_NB','FSMethod52_MLR'},...
'VariableNames',{'Accuracy','Precision','Recall'})
m5 = 5×3 table
| | Accuracy | Precision | Recall |
|---|---|---|---|
| 1 FSMethod52_glm | 0.5015 | 0.5130 | 0.4205 |
| 2 FSMethod52_TB | 0.5067 | 0.5095 | 0.4160 |
| 3 FSMethod52_DT | 0.4994 | 0.5193 | 0.4915 |
| 4 FSMethod52_NB | 0.4861 | 0.5344 | 0.4192 |
| 5 FSMethod52_MLR | 0.5015 | 0.5130 | 0.4205 |
Lasso Regression
[b_lasso, fit_lasso]=lasso(Xtrain1,Ytrain1,"CV",5);
pred6=round(Xtest1*b_lasso(:,fit_lasso.IndexMinMSE) + fit_lasso.Intercept(fit_lasso.IndexMinMSE)); %coefficients and intercept at the CV-minimizing lambda
[accuracy6,precision6,recall6]=meas(categorical(pred6),Ytest1);
method1(6,:)=[accuracy6,precision6,recall6]
0.5211 0.4832 0.3336
0.4943 0.5272 0.4205
0.4665 0.5521 0.4989
0.5273 0.4744 0.3404
0.5211 0.4832 0.3336
0.5130 0.4932 0.3673
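Selecting the coefficient column at fit_lasso.IndexMinMSE and rounding the linear prediction to a 0/1 label can be sketched as (Python, with hypothetical coefficient values):

```python
# B is a features-by-lambdas coefficient matrix; pick the column for the
# lambda minimizing cross-validated MSE, add its intercept, and round.
def lasso_predict(X, B, intercepts, idx_min_mse):
    beta = [row[idx_min_mse] for row in B]
    b0 = intercepts[idx_min_mse]
    return [round(sum(x * w for x, w in zip(row, beta)) + b0) for row in X]

B = [[0.0, 0.4], [0.0, 0.2]]  # hypothetical: 2 features, 2 lambdas
print(lasso_predict([[1.0, 1.0], [0.0, 0.0]], B, [0.1, 0.1], 1))  # -> [1, 0]
```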
[b_lasso, fit_lasso]=lasso(Xtrain2,Ytrain2,"CV",5);
pred6=round(Xtest2*b_lasso(:,fit_lasso.IndexMinMSE) + fit_lasso.Intercept(fit_lasso.IndexMinMSE));
[accuracy6,precision6,recall6]=meas(categorical(pred6),Ytest2);
method2(6,:)=[accuracy6,precision6,recall6]
0.5108 0.4970 0.3712
0.4923 0.5249 0.4614
0.5026 0.5123 0.4430
0.5119 0.4928 0.3566
0.5108 0.4970 0.3712
0.5258 0.5476 0.4182
[b_lasso, fit_lasso]=lasso(Xtrain3,Ytrain3,"CV",5);
pred6=round(Xtest3*b_lasso(:,fit_lasso.IndexMinMSE) + fit_lasso.Intercept(fit_lasso.IndexMinMSE));
[accuracy6,precision6,recall6]=meas(categorical(pred6),Ytest3);
method3(6,:)=[accuracy6,precision6,recall6]
0.4882 0.5309 0.4574
0.4881 0.5362 0.4281
0.5211 0.4942 0.4394
0.4778 0.5432 0.4385
0.4882 0.5309 0.4574
0.5206 0.5098 0.2549
[b_lasso, fit_lasso]=lasso(Xtrain51,Ytrain51,"CV",5);
pred6=round(Xtest51*b_lasso(:,fit_lasso.IndexMinMSE) + fit_lasso.Intercept(fit_lasso.IndexMinMSE));
[accuracy6,precision6,recall6]=meas(categorical(pred6),Ytest51);
method51(6,:)=[accuracy6,precision6,recall6]
0.4954 0.5284 0.3922
0.4933 0.5296 0.4137
0.4902 0.5299 0.4781
0.5036 0.5181 0.3500
0.4954 0.5284 0.3922
0.4880 0.5368 0.8565
[b_lasso, fit_lasso]=lasso(Xtrain52,Ytrain52,"CV",5);
pred6=round(Xtest52*b_lasso(:,fit_lasso.IndexMinMSE) + fit_lasso.Intercept(fit_lasso.IndexMinMSE));
[accuracy6,precision6,recall6]=meas(categorical(pred6),Ytest52);
method52(6,:)=[accuracy6,precision6,recall6]
0.5015 0.5130 0.4205
0.5067 0.5095 0.4160
0.4994 0.5193 0.4915
0.4861 0.5344 0.4192
0.5015 0.5130 0.4205
0.4715 0.5773 0.5283
classifierused={'Logistic Regression','Tree Bagger','Decision Tree','Naive Bayes','Multinomial Regression','Lasso Regression'};
array2table([method1(i,:);method2(i,:);method3(i,:);method51(i,:);method52(i,:)],...
'VariableNames',{'Accuracy','Precision','Recall'},'Rownames',...
{'TTest based','Step based','Reduction based','PCA(PC1-2) based','PCA(PC1-4) based'})
end
ans =
{'Logistic Regression'}
ans = 5×3 table
| | Accuracy | Precision | Recall |
|---|---|---|---|
| 1 TTest based | 0.5211 | 0.4832 | 0.3336 |
| 2 Step based | 0.5108 | 0.4970 | 0.3712 |
| 3 Reduction based | 0.4882 | 0.5309 | 0.4574 |
| 4 PCA(PC1-2) based | 0.4954 | 0.5284 | 0.3922 |
| 5 PCA(PC1-4) based | 0.5015 | 0.5130 | 0.4205 |
ans =
{'Tree Bagger'}
ans = 5×3 table
| | Accuracy | Precision | Recall |
|---|---|---|---|
| 1 TTest based | 0.4943 | 0.5272 | 0.4205 |
| 2 Step based | 0.4923 | 0.5249 | 0.4614 |
| 3 Reduction based | 0.4881 | 0.5362 | 0.4281 |
| 4 PCA(PC1-2) based | 0.4933 | 0.5296 | 0.4137 |
| 5 PCA(PC1-4) based | 0.5067 | 0.5095 | 0.4160 |
ans =
{'Decision Tree'}
ans = 5×3 table
| | Accuracy | Precision | Recall |
|---|---|---|---|
| 1 TTest based | 0.4665 | 0.5521 | 0.4989 |
| 2 Step based | 0.5026 | 0.5123 | 0.4430 |
| 3 Reduction based | 0.5211 | 0.4942 | 0.4394 |
| 4 PCA(PC1-2) based | 0.4902 | 0.5299 | 0.4781 |
| 5 PCA(PC1-4) based | 0.4994 | 0.5193 | 0.4915 |
ans =
{'Naive Bayes'}
ans = 5×3 table
| | Accuracy | Precision | Recall |
|---|---|---|---|
| 1 TTest based | 0.5273 | 0.4744 | 0.3404 |
| 2 Step based | 0.5119 | 0.4928 | 0.3566 |
| 3 Reduction based | 0.4778 | 0.5432 | 0.4385 |
| 4 PCA(PC1-2) based | 0.5036 | 0.5181 | 0.3500 |
| 5 PCA(PC1-4) based | 0.4861 | 0.5344 | 0.4192 |
ans =
{'Multinomial Regression'}
ans = 5×3 table
| | Accuracy | Precision | Recall |
|---|---|---|---|
| 1 TTest based | 0.5211 | 0.4832 | 0.3336 |
| 2 Step based | 0.5108 | 0.4970 | 0.3712 |
| 3 Reduction based | 0.4882 | 0.5309 | 0.4574 |
| 4 PCA(PC1-2) based | 0.4954 | 0.5284 | 0.3922 |
| 5 PCA(PC1-4) based | 0.5015 | 0.5130 | 0.4205 |
ans =
{'Lasso Regression'}
ans = 5×3 table
| | Accuracy | Precision | Recall |
|---|---|---|---|
| 1 TTest based | 0.5130 | 0.4932 | 0.3673 |
| 2 Step based | 0.5258 | 0.5476 | 0.4182 |
| 3 Reduction based | 0.5206 | 0.5098 | 0.2549 |
| 4 PCA(PC1-2) based | 0.4880 | 0.5368 | 0.8565 |
| 5 PCA(PC1-4) based | 0.4715 | 0.5773 | 0.5283 |
res(i,:)=[method1(i,:),method2(i,:),method3(i,:),method51(i,:),method52(i,:)];
array2table(res','VariableNames',classifierused,'RowNames',...
{'TTest based Accuracy','TTest based Precision','TTest based Recall',...
'Step based Accuracy','Step based Precision','Step based Recall',....
'Reduction based Accuracy','Reduction based Precision','Reduction based Recall',...
'PCA(PC1-2) based Accuracy','PCA(PC1-2) based Precision','PCA(PC1-2) based Recall',...
'PCA(PC1-4) based Accuracy','PCA(PC1-4) based Precision','PCA(PC1-4) based Recall'})
ans = 15×6 table
| | Logistic Regression | Tree Bagger | Decision Tree | Naive Bayes | Multinomial Regression | Lasso Regression |
|---|---|---|---|---|---|---|
| 1 TTest based Accuracy | 0.5211 | 0.4943 | 0.4665 | 0.5273 | 0.5211 | 0.5130 |
| 2 TTest based Precision | 0.4832 | 0.5272 | 0.5521 | 0.4744 | 0.4832 | 0.4932 |
| 3 TTest based Recall | 0.3336 | 0.4205 | 0.4989 | 0.3404 | 0.3336 | 0.3673 |
| 4 Step based Accuracy | 0.5108 | 0.4923 | 0.5026 | 0.5119 | 0.5108 | 0.5258 |
| 5 Step based Precision | 0.4970 | 0.5249 | 0.5123 | 0.4928 | 0.4970 | 0.5476 |
| 6 Step based Recall | 0.3712 | 0.4614 | 0.4430 | 0.3566 | 0.3712 | 0.4182 |
| 7 Reduction based Accuracy | 0.4882 | 0.4881 | 0.5211 | 0.4778 | 0.4882 | 0.5206 |
| 8 Reduction based Precision | 0.5309 | 0.5362 | 0.4942 | 0.5432 | 0.5309 | 0.5098 |
| 9 Reduction based Recall | 0.4574 | 0.4281 | 0.4394 | 0.4385 | 0.4574 | 0.2549 |
| 10 PCA(PC1-2) based Accuracy | 0.4954 | 0.4933 | 0.4902 | 0.5036 | 0.4954 | 0.4880 |
| 11 PCA(PC1-2) based Precision | 0.5284 | 0.5296 | 0.5299 | 0.5181 | 0.5284 | 0.5368 |
| 12 PCA(PC1-2) based Recall | 0.3922 | 0.4137 | 0.4781 | 0.3500 | 0.3922 | 0.8565 |
| 13 PCA(PC1-4) based Accuracy | 0.5015 | 0.5067 | 0.4994 | 0.4861 | 0.5015 | 0.4715 |
| 14 PCA(PC1-4) based Precision | 0.5130 | 0.5095 | 0.5193 | 0.5344 | 0.5130 | 0.5773 |
| 15 PCA(PC1-4) based Recall | 0.4205 | 0.4160 | 0.4915 | 0.4192 | 0.4205 | 0.5283 |
UNSUPERVISED LEARNING
K-means Clustering
Here, we define k=2 because the goal of the study is to determine advanced fibrosis only, so the expected answer is yes/no.
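For intuition, the alternating assign/update loop behind kmeans with k=2 can be sketched in one dimension (Python, illustrative only):

```python
import random

# Minimal 1-D k-means with k=2: assign points to the nearest centre,
# then recompute each centre as its cluster mean, and repeat.
def kmeans2(xs, iters=20, seed=0):
    c = random.Random(seed).sample(xs, 2)
    for _ in range(iters):
        groups = ([], [])
        for x in xs:
            groups[abs(x - c[0]) > abs(x - c[1])].append(x)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return sorted(c)

centres = kmeans2([1.0, 1.1, 0.9, 5.0, 5.2, 4.8])  # converges near 1.0 and 5.0
```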
K-medoids on Binary data only
[idx,c]=kmedoids(x1,2,'Distance',"hamming");
title('K-medoids on Binary Data only')
% Visualize the clustering in, say, predictors 5 and 8
plot(x1(idx==1,5),x1(idx==1,8),'r.','MarkerSize',12)
hold on
plot(x1(idx==2,5),x1(idx==2,8),'b.','MarkerSize',12)
plot(c(:,5),c(:,8),'kx','MarkerSize',15,'LineWidth',3)
legend('Cluster 1','Cluster 2','Centroids','Location','best')
title('Cluster Assignments and Centroids')
hold off
[ac,pr,re]=meas(categorical(idx),outp);
confusionmat(double(idx),double(outp))
Class List in given sample
1
2
Total Instance = 1385
class1==>1
class2==>2
Confusion Matrix
predict_class1 predict_class2
______________ ______________
Actual_class1 398 441
Actual_class2 270 276
Two-Class Confusion Matrix
''               'Predicted_1'    'Predicted_2'
'Actual_1'       TP = 398         FN = 441
'Actual_2'       FP = 270         TN = 276
Overall values
Accuracy: 0.4866
Error: 0.5134
Sensitivity: 0.4744
Specificity: 0.5055
Precision: 0.5958
FalsePositiveRate: 0.4945
F1_score: 0.5282
MatthewsCorrelationCoefficient: 0.0197
Kappa: 0.0187
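The overall values above follow directly from the 2×2 confusion matrix (reading TP=398, FN=441, FP=270, TN=276 from it); a Python check:

```python
import math

# Recompute accuracy, sensitivity, specificity, precision, and MCC
# from the confusion-matrix counts of the binary-data K-medoids run.
def overall(tp, fn, fp, tn):
    n = tp + fn + fp + tn
    acc = (tp + tn) / n
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    prec = tp / (tp + fp)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return acc, sens, spec, prec, mcc

acc, sens, spec, prec, mcc = overall(398, 441, 270, 276)
print(round(acc, 4), round(sens, 4), round(spec, 4), round(prec, 4))
# -> 0.4866 0.4744 0.5055 0.5958 (|MCC| rounds to 0.0197; its sign
# depends on which class is treated as positive)
```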
K-means on continuous data only
title('K-means on continuous data only')
% Visualize the clustering in, say, predictors 5 and 8
plot(x1(idx==1,5),x1(idx==1,8),'r.','MarkerSize',12)
hold on
plot(x1(idx==2,5),x1(idx==2,8),'b.','MarkerSize',12)
plot(c(:,5),c(:,8),'kx','MarkerSize',15,'LineWidth',3)
legend('Cluster 1','Cluster 2','Centroids','Location','best')
title('Cluster Assignments and Centroids')
hold off
[ac,pr,re]=meas(categorical(idx),outp);
confusionmat(double(idx),double(outp))
Class List in given sample
1
2
Total Instance = 1385
class1==>1
class2==>2
Confusion Matrix
predict_class1 predict_class2
______________ ______________
Actual_class1 366 411
Actual_class2 302 306
Two-Class Confusion Matrix
''               'Predicted_1'    'Predicted_2'
'Actual_1'       TP = 366         FN = 411
'Actual_2'       FP = 302         TN = 306
Overall values
Accuracy: 0.4852
Error: 0.5148
Sensitivity: 0.4710
Specificity: 0.5033
Precision: 0.5479
FalsePositiveRate: 0.4967
F1_score: 0.5066
MatthewsCorrelationCoefficient: 0.0255
Kappa: 0.0246
K-means on entire data
title('K-means on entire data')
% Visualize the clustering in, say, predictors 5 and 27
plot(X(idx==1,5),X(idx==1,27),'r.','MarkerSize',12)
hold on
plot(X(idx==2,5),X(idx==2,27),'b.','MarkerSize',12)
plot(c(:,5),c(:,27),'kx','MarkerSize',15,'LineWidth',3)
legend('Cluster 1','Cluster 2','Centroids','Location','best')
title('Cluster Assignments and Centroids')
hold off
[ac,pr,re]=meas(categorical(idx),outp);
confusionmat(double(idx),double(outp))
Class List in given sample
1
2
Total Instance = 1385
class1==>1
class2==>2
Confusion Matrix
predict_class1 predict_class2
______________ ______________
Actual_class1 347 347
Actual_class2 321 370
Two-Class Confusion Matrix
''               'Predicted_1'    'Predicted_2'
'Actual_1'       TP = 347         FN = 347
'Actual_2'       FP = 321         TN = 370
Overall values
Accuracy: 0.5177
Error: 0.4823
Sensitivity: 0.5000
Specificity: 0.5355
Precision: 0.5195
FalsePositiveRate: 0.4645
F1_score: 0.5095
MatthewsCorrelationCoefficient: 0.0355
Kappa: 0.0355
K-medoids on entire data
title('K-medoids on entire data')
% Visualize the clustering in, say, predictors 5 and 27
plot(X(idx==1,5),X(idx==1,27),'r.','MarkerSize',12)
hold on
plot(X(idx==2,5),X(idx==2,27),'b.','MarkerSize',12)
plot(c(:,5),c(:,27),'kx','MarkerSize',15,'LineWidth',3)
legend('Cluster 1','Cluster 2','Centroids','Location','best')
title('Cluster Assignments and Centroids')
hold off
[ac,pr,re]=meas(categorical(idx),outp);
confusionmat(double(idx),double(outp))
Class List in given sample
1
2
Total Instance = 1385
class1==>1
class2==>2
Confusion Matrix
predict_class1 predict_class2
______________ ______________
Actual_class1 418 400
Actual_class2 250 317
Two-Class Confusion Matrix
''               'Predicted_1'    'Predicted_2'
'Actual_1'       TP = 418         FN = 400
'Actual_2'       FP = 250         TN = 317
Overall values
Accuracy: 0.5307
Error: 0.4693
Sensitivity: 0.5110
Specificity: 0.5591
Precision: 0.6257
FalsePositiveRate: 0.4409
F1_score: 0.5626
MatthewsCorrelationCoefficient: 0.0690
Kappa: 0.0674
array2table([m1;m2;m3;m4],'RowNames',{'K-medoids_Binary','K-means_Continuous','K-means_Entire','K-medoids_Entire'},...
'VariableNames',{'Accuracy','Precision','Recall'})
ans = 4×3 table
| | Accuracy | Precision | Recall |
|---|---|---|---|
| 1 K-medoids_Binary | 0.4866 | 0.4744 | 0.5958 |
| 2 K-means_Continuous | 0.4852 | 0.4710 | 0.5479 |
| 3 K-means_Entire | 0.5177 | 0.5000 | 0.5195 |
| 4 K-medoids_Entire | 0.5307 | 0.5110 | 0.6257 |
Compare FS Method models with Random Guess
% compare with 100 random predictions
random_pred = Ytrain1(randperm(length(Ytest1)));
accuracy_guess(i,1) = sum(random_pred == Ytest1)/length(Ytest1);
precision_guess(i,1) = sum(random_pred==1 & Ytest1==1)/sum(random_pred==1);
recall_guess(i,1) = sum(random_pred==1 & Ytest1==1)/sum(Ytest1==1);
histogram(accuracy_guess)
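The random-guess baseline shuffles the observed labels and scores each shuffle against the originals; a Python sketch with illustrative labels:

```python
import random

# Accuracy of one random label shuffle scored against the original labels.
def guess_accuracy(y, seed):
    shuffled = y[:]
    random.Random(seed).shuffle(shuffled)
    return sum(a == b for a, b in zip(shuffled, y)) / len(y)

y = [0, 1] * 50                                    # balanced illustrative labels
accs = [guess_accuracy(y, s) for s in range(100)]  # 100 random guesses
```

With balanced labels these accuracies concentrate around chance level, which is the distribution the t-tests below compare the fitted models against.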
% Naive Bayes with TTest feature selection gives the maximum accuracy
[h_acc, p_acc] = ttest2(method1(4,1), accuracy_guess)
[h_pre, p_pre] = ttest2(method1(4,2), precision_guess)
[h_rec, p_rec] = ttest2(method1(4,3), recall_guess)
h_rec = 1
p_rec = 3.1421e-07
% K-medoids on entire data performed
[h_acc, p_acc] = ttest2(m4(1,1), accuracy_guess)
[h_pre, p_pre] = ttest2(m4(1,2), precision_guess)
[h_rec, p_rec] = ttest2(m4(1,3), recall_guess)
function [acc,prec,rec]=meas(p,y)
% p: predicted labels (categorical); y: true labels (numeric or categorical)
y=categorical(y);
acc=sum(p==y)/length(y);
prec=sum(p=='1' & y=='1')/sum(p=='1');
rec=sum(p=='1' & y=='1')/sum(y=='1');
end